Posts Tagged ‘YouTube’

A multimedia collaboration between photographer Platon and UNHCR launched

March 30, 2023

On 27 March 2023 “Portrait of a Stranger,” a creative multimedia collaboration between world-renowned photographer and storyteller Platon and UNHCR, was launched in partnership with the Movies That Matter International Human Rights Film Festival in The Hague, Netherlands.

The 18-minute film features interviews and portraits of over 20 refugees who fled conflict and persecution in various parts of the world, exploring the universal desire to be free, safe, respected and valued, and to belong.

Over the last year, UNHCR and Platon interviewed a diverse group of refugees ranging in age, nationality, ethnicity and personal circumstances. The result, Portrait of a Stranger, is a holistic multimedia experience marrying film and photography. It asks audiences to look beyond our differences and instead focus on our shared humanity.

“Living in exile may be their life circumstance, but it is not what defines them,” said Platon. “I hope the images and voices of the refugees in this film will help audiences focus on the shared humanity that unites us, rather than the barriers that divide us. Not only for these particular refugees but for all people forced to flee around the world.”

As the number of people forcibly displaced continues to rise – last year there were more than 100 million people uprooted globally – it is hoped that the collaboration will help to reframe the narratives and perceptions around people forced to flee.  

“This film and these images are powerful reminders of who refugees really are. They are people like your neighbour, your friend, your colleague. Like you and me, each with our own personality; our hopes; our dreams,” United Nations High Commissioner for Refugees, Filippo Grandi, said. “By amplifying the voices of refugees, the film offers an important reality check to counter the negative public discourse we often hear about people forced to flee.”

About Platon:  

Photographer, communicator and storyteller Platon has gained worldwide fame with his portraits. Platon has worked with a range of international publications including Rolling Stone, Vanity Fair and Esquire, and won a Peabody Award for his photo essays for The New Yorker. He has photographed over 30 covers for TIME Magazine and is a World Press Photo laureate. He is currently on the board for Arts and Culture at the World Economic Forum. In 2013, Platon founded The People’s Portfolio, a non-profit foundation dedicated to celebrating emerging leaders of human rights and civil rights around the world. See also: https://humanrightsdefenders.blog/2015/02/25/photographer-platon-speaks-about-human-rights-in-indiana-wells-on-february-27

https://www.unhcr.org/news/press/2023/3/642175f64/unhcr-platon-launch-collaboration-bring-refugee-voices-aspirations-focus.html

Podcast with Ketty Nivyabandi, poet & woman human rights defender from Burundi

July 20, 2020

Ketty Nivyabandi is a Burundian activist and poet who led the first women-only demonstrations against Burundi’s president in 2015. She defied police beatings, tear gas, and a water cannon to make women’s voices heard.

In this podcast, the Human Rights Foundation dives into Burundi’s authoritarian regime and Ketty’s resistance to Burundi’s dictatorship. What role can women play in protesting and organizing? How do you survive police brutality? How can people remain hopeful and support protestors in Burundi?

https://www.youtube.com/watch?v=XfJuctTwAuA&feature=youtu.be

Algorithms designed to suppress ISIS content may also suppress evidence of human rights violations

April 11, 2020
Facebook and YouTube designed algorithms to suppress ISIS content. They're having unexpected side effects.

Illustration by Leo Acadia for TIME
TIME of 11 April 2020 carries a long article by Billy Perrigo entitled “These Tech Companies Managed to Eradicate ISIS Content. But They’re Also Erasing Crucial Evidence of War Crimes”. It is a very interesting piece that clearly spells out the dilemma of suppressing too much or too little on Facebook, YouTube, etc. Algorithms designed to suppress ISIS content are having unexpected side effects, such as suppressing evidence of human rights violations.
…..Images posted by citizen journalist Abo Liath Aljazarawy to his Facebook page, Eye on Alhasakah, showed the ground reality of the Syrian civil war. His page was banned. Facebook confirmed to TIME that Eye on Alhasakah was flagged in late 2019 by its algorithms, as well as by users, for sharing “extremist content.” It was then funneled to a human moderator, who decided to remove it. After being notified by TIME, Facebook restored the page in early February, some 12 weeks later, saying the moderator had made a mistake. (Facebook declined to say which specific videos were wrongly flagged, except that there were several.)

The algorithms were developed largely in reaction to ISIS, which shocked the world in 2014 when it began to share slickly-produced online videos of executions and battles as propaganda. Because of the very real way these videos radicalized viewers, the U.S.-led coalition in Iraq and Syria worked overtime to suppress them, and enlisted social networks to help. The companies quickly discovered that there was too much content for even a huge team of humans to deal with. (More than 500 hours of video are uploaded to YouTube every minute.) So, since 2017, they have been using algorithms to automatically detect extremist content. Early on, those algorithms were crude and only supplemented the human moderators’ work. But now, following three years of training, they are responsible for an overwhelming proportion of detections. Facebook now says more than 98% of the content it removes for violating its rules on extremism is flagged automatically. On YouTube, across the board, more than 20 million videos were taken down before receiving a single view in 2019. And as the coronavirus spread across the globe in early 2020, Facebook, YouTube and Twitter announced their algorithms would take on an even larger share of content moderation, with human moderators barred from taking sensitive material home with them.

But algorithms are notoriously worse than humans at understanding one crucial thing: context. Now, as Facebook and YouTube have come to rely on them more and more, even innocent photos and videos, especially from war zones, are being swept up and removed. Such content can serve a vital purpose for both civilians on the ground — for whom it provides real-time information — and human rights monitors far away. In 2017, for the first time ever, the International Criminal Court in the Netherlands issued a war-crimes indictment based on videos from Libya posted on social media. And as violence-detection algorithms have developed, conflict monitors are noticing an unexpected side effect, too: these algorithms could be removing evidence of war crimes from the Internet before anyone even knows it exists.

…..
It was an example of how even one mistaken takedown can make the work of human rights defenders more difficult. Yet this is happening on a wider scale: of the 1.7 million YouTube videos preserved by Syrian Archive, a Berlin-based non-profit that downloads evidence of human rights violations, 16% have been removed. A huge chunk was taken down in 2017, just as YouTube began using algorithms to flag violent and extremist content. And useful content is still being removed on a regular basis. “We’re still seeing that this is a problem,” says Jeff Deutsch, the lead researcher at Syrian Archive. “We’re not saying that all this content has to remain public forever. But it’s important that this content is archived, so it’s accessible to researchers, to human rights groups, to academics, to lawyers, for use in some kind of legal accountability.” (YouTube says it is working with Syrian Archive to improve how they identify and preserve footage that could be useful for human rights groups.)

…..

Facebook and YouTube’s detection systems work by using a technology called machine learning, by which colossal amounts of data (in this case, extremist images, videos, and their metadata) are fed to an artificial intelligence adept at spotting patterns. Early types of machine learning could be trained to identify images containing a house, or a car, or a human face. But since 2017, Facebook and YouTube have been feeding these algorithms content that moderators have flagged as extremist — training them to automatically identify beheadings, propaganda videos and other unsavory content.
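
What TIME describes is, at bottom, supervised classification: material that human moderators have already labeled extremist or benign is fed to a model, which then scores new uploads on its own. The toy sketch below (in Python, with invented example posts and labels) only illustrates the idea; real systems train on millions of images, videos and metadata, not a handful of captions:

```python
# Minimal sketch of the kind of supervised "extremist content" classifier
# described above. The example posts and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Moderator-labelled training data: 1 = flagged as extremist, 0 = benign.
posts = [
    "execution video glorifying our fighters, join us and fight",   # 1
    "propaganda: our brigade took the town, watch the battle footage",  # 1
    "step-by-step family recipe for lentil soup",                   # 0
    "citizen battle footage documenting the shelling of a hospital",  # 0
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# New uploads are scored automatically; anything over the threshold
# can be removed before a human ever sees it.
new_post = ["battle footage from the front line today"]
score = model.predict_proba(new_post)[0, 1]
print(f"extremist score: {score:.2f} ->", "remove" if score > 0.5 else "keep")
```

Note that “battle footage” appears in both a flagged and a benign training example: a pattern-matching model has no inherent way to tell a citizen journalist documenting abuses from a propagandist celebrating them, which is precisely the context problem described above.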

Both Facebook and YouTube are notoriously secretive about what kind of content they’re using to train the algorithms responsible for much of this deletion. That means there’s no way for outside observers to know whether innocent content — like Eye on Alhasakah’s — has already been fed in as training data, which would compromise the algorithm’s decision-making. In the case of Eye on Alhasakah’s takedown, “Facebook said, ‘oops, we made a mistake,’” says Dia Kayyali, the Tech and Advocacy coordinator at Witness, a human rights group focused on helping people record digital evidence of abuses. “But what if they had used the page as training data? Then that mistake has been exponentially spread throughout their system, because it’s going to train the algorithm more, and then more of that similar content that was mistakenly taken down is going to get taken down. I think that is exactly what’s happening now.” Facebook and YouTube, however, both deny this is possible. Facebook says it regularly retrains its algorithms to avoid this happening. In a statement, YouTube said: “decisions made by human reviewers help to improve the accuracy of our automated flagging systems.”
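
Kayyali’s worry can be made concrete with a toy simulation. Under the hypothetical assumption that everything the model removes is fed back as a confirmed positive training example (exactly the scenario the companies deny), each new generation of the classifier removes slightly more benign borderline content than the last. All numbers below are invented; only the direction of the drift matters:

```python
# Toy simulation of the training-feedback loop Kayyali describes.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# One synthetic "graphic content" feature: benign items score low on
# average, extremist items high, with overlap (the hard borderline cases).
X = np.vstack([rng.normal(0, 1, (5000, 1)),    # benign
               rng.normal(3, 1, (500, 1))])    # extremist
truth = np.concatenate([np.zeros(5000), np.ones(500)])

clf = LogisticRegression().fit(X, truth)
for generation in range(5):
    removed = clf.predict(X) == 1
    wrongly_removed = int((removed & (truth == 0)).sum())
    print(f"generation {generation}: benign items removed = {wrongly_removed}")
    # Feedback loop: whatever was removed is treated as a confirmed
    # positive example when the next generation of the model is trained.
    feedback_labels = truth.copy()
    feedback_labels[removed] = 1
    clf = LogisticRegression().fit(X, feedback_labels)
```

In this setup the decision boundary tends to creep toward flagging more benign content with each retraining round, which is what conflict monitors fear.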

…….
That’s because Facebook’s policies allow some types of violence and extremism but not others — meaning decisions on whether to take content down are often based on cultural context. Has a video of an execution been shared by its perpetrators to spread fear? Or by a citizen journalist to ensure the wider world sees a grave human rights violation? A moderator’s answer to those questions could mean that of two identical videos, one remains online and the other is taken down. “This technology can’t yet effectively handle everything that is against our rules,” Saltman said. “Many of the decisions we have to make are complex and involve decisions around intent and cultural nuance which still require human eye and judgement.”

In this balancing act, it’s Facebook’s army of human moderators — many of them outsourced contractors — who carry the pole. And sometimes, they lose their footing. After several of Eye on Alhasakah’s posts were flagged by algorithms and humans alike, a Facebook moderator wrongly decided the page should be banned entirely for sharing violent videos in order to praise them — a violation of Facebook’s rules on violence and extremism, which state that some content can remain online if it is newsworthy, but not if it encourages violence or valorizes terrorism. The nuance, Facebook representatives told TIME, is important for balancing freedom of speech with a safe environment for its users — and keeping Facebook on the right side of government regulations.

Facebook’s set of rules on the topic reads like a gory textbook on ethics: beheadings, decomposed bodies, throat-slitting and cannibalism are all classed as too graphic, and thus never allowed; neither is dismemberment — unless it’s being performed in a medical setting; nor burning people, unless they are practicing self-immolation as an act of political speech, which is protected. Moderators are given discretion, however, if violent content is clearly being shared to spread awareness of human rights abuses. “In these cases, depending on how graphic the content is, we may allow it, but we place a warning screen in front of the content and limit the visibility to people aged 18 or over,” said Saltman. “We know not everyone will agree with these policies and we respect that.”
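
Spelled out as logic, the policy as TIME paraphrases it reads roughly like the decision tree below. The category names and the order of the checks are a hypothetical reconstruction for illustration, not Facebook’s actual rulebook:

```python
# Hypothetical encoding of the moderation rules as paraphrased above.
# Category names and decision order are invented for illustration.
from dataclasses import dataclass

@dataclass
class Content:
    category: str                  # e.g. "beheading", "dismemberment", "burning"
    medical_setting: bool = False
    political_self_immolation: bool = False
    raises_awareness_of_abuses: bool = False

# "Too graphic, and thus never allowed", regardless of context.
NEVER_ALLOWED = {"beheading", "decomposed_bodies", "throat_slitting", "cannibalism"}

def moderate(c: Content) -> str:
    if c.category in NEVER_ALLOWED:
        return "remove"
    if c.category == "dismemberment" and not c.medical_setting:
        return "remove"
    if c.category == "burning" and not c.political_self_immolation:
        return "remove"
    # Moderator discretion: graphic content documenting abuses may stay,
    # behind a warning screen and restricted to viewers aged 18 or over.
    if c.raises_awareness_of_abuses:
        return "allow with warning screen, 18+ only"
    return "allow"

print(moderate(Content("burning", political_self_immolation=True)))   # allow
print(moderate(Content("beheading", raises_awareness_of_abuses=True)))  # remove
```

Even in this toy form, the branches show why human judgment keeps being pulled back in: “medical setting,” “political speech” and “spreading awareness” are exactly the contextual calls that algorithms get wrong.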

But civilian journalists operating in the heat of a civil war don’t always have time to read the fine print. And conflict monitors say it’s not enough for Facebook and YouTube to make all the decisions themselves. “Like it or not, people are using these social media platforms as a place of permanent record,” says Woods. “The social media sites don’t get to choose what’s of value and importance.”

See also: https://humanrightsdefenders.blog/2019/06/17/social-media-councils-an-answer-to-problems-of-content-moderation-and-distribution/

https://time.com/5798001/facebook-youtube-algorithms-extremism/

Reminder: for verification of YouTube videos there is a Citizen Evidence Lab tool

November 13, 2018

With the avalanche of fake news, and the BBC doing an interesting series on this topic, it is good to remind you that there is a tool that can help human rights defenders check the veracity of YouTube videos. The original post was published on 8 July 2014.
During crises or disasters, YouTube is widely used to share footage—including a host of videos that are old or, in some cases, staged or faked. An enormous challenge for human rights workers, journalists or first responders alike is to separate fact from fiction. Now, there’s a website that can help with this. The Citizen Evidence Lab (http://citizenevidence.org/) is the first dedicated verification resource for human rights workers, providing tools for speedy checks on YouTube videos as well as for more advanced assessment. (Video produced by Gaby Sferra)
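
One of the basic checks such verification work involves, comparing a video’s claimed date against its actual upload time, can even be scripted. The sketch below uses the public YouTube Data API v3; the API key and video ID are placeholders you would supply yourself:

```python
# Sketch of a first-pass YouTube verification step: pull the canonical
# upload timestamp and channel for a video, then compare them with the
# date and source the footage claims. Key and video ID are placeholders.
import requests

API_KEY = "YOUR_API_KEY"      # obtained from the Google Cloud console
VIDEO_ID = "YOUR_VIDEO_ID"    # the ID after "watch?v=" in the URL

resp = requests.get(
    "https://www.googleapis.com/youtube/v3/videos",
    params={"part": "snippet", "id": VIDEO_ID, "key": API_KEY},
    timeout=10,
)
resp.raise_for_status()
items = resp.json().get("items", [])
if not items:
    print("Video not found (deleted, private, or wrong ID).")
else:
    snippet = items[0]["snippet"]
    print("Title:      ", snippet["title"])
    print("Channel:    ", snippet["channelTitle"])
    print("Uploaded at:", snippet["publishedAt"])  # ISO 8601, UTC
```

An upload timestamp that predates the event a video claims to show is one of the most common signs of recycled footage.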
https://www.bollyinside.com/citizen-evidence-lab-how-to-authenticate-youtube-videos-bollyinside/

Profile of Yaxue Cao of ChinaChange.org

February 9, 2018

On 9 November 2017 ISHR met Yaxue Cao, the founder and editor of ChinaChange.org, an English-language website devoted to news and commentary related to civil society, rule of law, and human rights activities in China. She works to help the rest of the world understand what people are thinking and doing to effect change in China. Reports and translations on China Change have been cited widely in leading global news outlets and in U.S. Congressional reports. Yaxue Cao grew up in northern China during the Cultural Revolution and studied literature in the US. She lives in Washington, DC.

10 Films Every Human Rights Defender Should Watch at the HRW Film Festival

May 31, 2014

I announced the HRW film festival in an earlier post [https://thoolen.wordpress.com/2014/05/13/human-rights-watch-film-festival-celebrates-25th-anniversary-with-5-films-on-human-rights-defenders/] but now that the Huffington Post of 31 May 2014 has listed the 10 films it says every human rights defender should see, I gladly share their picks:

1. Sepideh — Reaching for the Stars (Denmark/Iran/Germany/Norway/Sweden) The story of a teenage girl named Sepideh, living in a rural village outside of Tehran, who dreams of becoming a famous astronomer. The documentary tackles gender roles in Iran while showcasing one young woman’s ambition and strength in the face of her family’s discouragement, university pitfalls and societal expectations. Directed by Berit Madsen. Trailer: https://www.youtube.com/watch?v=wTzbIc6oiqs

2. Dangerous Acts Starring the Unstable Elements of Belarus (US/UK/Belarus) Made up of smuggled footage and uncensored interviews, this documentary gives audiences a glimpse into Belarus’ dissident movement as it takes the shape of stage performances and public activism. Directed by Madeleine Sackler. Trailer: https://www.youtube.com/watch?v=LGALySJ3O24

3. Lady Valor: The Kristin Beck Story (US) A veteran shares her story of moving from one identity, a former U.S. Navy SEAL named Chris Beck, to another, a transgender woman named Kristin Beck. Directed by Sandrine Orabona. Trailer: https://www.youtube.com/watch?v=r21OdLSTfQY

4. A Quiet Inquisition (US) Here you’ll meet OBGYN Dr. Carla Cerrato, who must navigate the perilous territory of Nicaragua’s anti-abortion policies, which prohibit abortion, even in cases of rape, incest, or when a woman’s life is at stake. Directed by Alessandra Zeka and Holen Sabrina Kahn.

5. Scheherazade’s Diary (Lebanon) This “tragicomic documentary” follows women inmates in Lebanon as they stage a theater/drama therapy project titled “Scheherazade in Baabda,” revealing personal stories of domestic violence, failed relationships and traumas associated with motherhood. Directed by Zeina Daccache. Trailer: https://www.youtube.com/watch?v=5VnZGmd6EMg

6. Siddharth (Canada/India) One father’s desperate journey to locate his son, a 12-year-old boy who was sent to work in another province to support his family, but did not return and is feared to have been kidnapped or trafficked. Directed by Richie Mehta.

7. The Supreme Price (US) The film covers the evolution of the Pro-Democracy Movement in Nigeria and efforts to increase the participation of women in leadership roles. Directed by Joanna Lipper.

8. Private Violence (US) Questioning the accepted discourse on domestic violence, the documentary introduces audiences to two women survivors who advocate for justice while exploring “the fact that the most dangerous place for a woman is her home.” Directed by Cynthia Hill.

9. The Beekeeper (Switzerland) This is the touching story of Ibrahim Gezer, a Kurdish beekeeper from southeast Turkey who, robbed of his family, possessions and 500 bee colonies, moves to Switzerland to make a new life. Directed by Mano Khalil.

10. Abounaddara Collective Shorts from Syria (Syria) The Abounaddara Collective is a group of filmmakers who came together in 2010 to help provide an alternative image of Syrian society, one not seen in mainstream media. This portion of the festival will showcase 90 minutes of their short films.

 

The Human Rights Watch Film Festival will run from June 12 to June 22, 2014. A complete schedule of screenings is available on the festival website.

10 Films Every Human Rights Advocate Should Watch.

Video on journalist Eskinder Nega in detention in Ethiopia

December 5, 2013

Ethiopian journalist Eskinder Nega is serving an 18-year prison sentence for “terrorism”. He was charged in 2011 after giving speeches and writing articles criticizing the government and supporting free speech. He is an Amnesty International prisoner of conscience. Eskinder has long been a thorn in the side of the Ethiopian authorities. He has previously been harassed, arrested and prosecuted a number of times for his writing. Between 2006 and 2007, Eskinder and his wife Serkalem Fasil were detained and tried on treason and other charges along with 129 other journalists, opposition politicians and activists. Serkalem gave birth to their son Nafkot while in prison. In May 2013, Eskinder wrote from prison: “I will live to see the light at the end of the tunnel. It may or may not be a long wait. Whichever way events may go, I shall persevere!”

 

Development of Amnesty’s Panic Button App

September 11, 2013

Having last week referred to three different (and competing?) tech initiatives to increase the security of HRDs, I would be remiss not to note the post of 11 September 2013 by Tanya O’Carroll on the AI blog concerning the development of the Panic Button. Over the next couple of months, she will keep you posted about the Panic Button. If you want to join the community of people working on Panic Button, please leave a comment on the site mentioned below or email panicbutton@amnesty.org.

via Inside the development of Amnesty’s new Panic Button App | Amnesty’s global human rights blog.

 

AI 2013 report comes with short video

May 29, 2013

Amnesty International’s 2013 report comes with an introductory video, which shows how governments use the excuse of ‘internal affairs’ in shameful attempts to block concerted international action to resolve human rights emergencies.


1 year of Human Rights Channel on YouTube: 90 countries. 1,892 videos

May 27, 2013

Twelve months ago, Witness and its partners at Storyful launched the first dedicated space on YouTube for verified citizen video on human rights issues.