Posts Tagged ‘evidence’

Witness’ animated film “We Have Rights” to be used when documenting ICE Arrests

August 27, 2020

It will take only three minutes to watch this well-done animated film “We Have Rights When Documenting ICE Arrests”, which Witness co-created for the We Have Rights campaign.

 

Algorithms designed to suppress ISIS content may also suppress evidence of human rights violations

April 11, 2020

Facebook and YouTube designed algorithms to suppress ISIS content. They're having unexpected side effects.

Illustration by Leo Acadia for TIME
TIME of 11 April 2020 carries a long article by Billy Perrigo entitled “These Tech Companies Managed to Eradicate ISIS Content. But They’re Also Erasing Crucial Evidence of War Crimes”. It is a very interesting piece that clearly spells out the dilemma of suppressing too much or too little on Facebook, YouTube, etc. Algorithms designed to suppress ISIS content are having unexpected side effects, such as suppressing evidence of human rights violations.
…..Images posted by citizen journalist Abo Liath Aljazarawy to his Facebook page (Eye on Alhasakah) showed the ground reality of the Syrian civil war. His page was banned. Facebook confirmed to TIME that Eye on Alhasakah was flagged in late 2019 by its algorithms, as well as by users, for sharing “extremist content.” It was then funneled to a human moderator, who decided to remove it. After being notified by TIME, Facebook restored the page in early February, some 12 weeks later, saying the moderator had made a mistake. (Facebook declined to say which specific videos were wrongly flagged, except that there were several.)

The algorithms were developed largely in reaction to ISIS, who shocked the world in 2014 when they began to share slickly-produced online videos of executions and battles as propaganda. Because of the very real way these videos radicalized viewers, the U.S.-led coalition in Iraq and Syria worked overtime to suppress them, and enlisted social networks to help. Quickly, the companies discovered that there was too much content for even a huge team of humans to deal with. (More than 500 hours of video are uploaded to YouTube every minute.) So, since 2017, both have been using algorithms to automatically detect extremist content. Early on, those algorithms were crude, and only supplemented the human moderators’ work. But now, following three years of training, they are responsible for an overwhelming proportion of detections. Facebook now says more than 98% of content removed for violating its rules on extremism is flagged automatically. On YouTube, across the board, more than 20 million videos were taken down before receiving a single view in 2019. And as the coronavirus spread across the globe in early 2020, Facebook, YouTube and Twitter announced their algorithms would take on an even larger share of content moderation, with human moderators barred from taking sensitive material home with them.

But algorithms are notoriously worse than humans at understanding one crucial thing: context. Now, as Facebook and YouTube have come to rely on them more and more, even innocent photos and videos, especially from war zones, are being swept up and removed. Such content can serve a vital purpose for both civilians on the ground — for whom it provides vital real-time information — and human rights monitors far away. In 2017, for the first time ever, the International Criminal Court in the Netherlands issued a war-crimes indictment based on videos from Libya posted on social media. And as violence-detection algorithms have developed, conflict monitors are noticing an unexpected side effect, too: these algorithms could be removing evidence of war crimes from the Internet before anyone even knows it exists.

…..
It was an example of how even one mistaken takedown can make the work of human rights defenders more difficult. Yet this is happening on a wider scale: of the 1.7 million YouTube videos preserved by Syrian Archive, a Berlin-based non-profit that downloads evidence of human rights violations, 16% have been removed. A huge chunk was taken down in 2017, just as YouTube began using algorithms to flag violent and extremist content. And useful content is still being removed on a regular basis. “We’re still seeing that this is a problem,” says Jeff Deutsch, the lead researcher at Syrian Archive. “We’re not saying that all this content has to remain public forever. But it’s important that this content is archived, so it’s accessible to researchers, to human rights groups, to academics, to lawyers, for use in some kind of legal accountability.” (YouTube says it is working with Syrian Archive to improve how they identify and preserve footage that could be useful for human rights groups.)

…..

Facebook and YouTube’s detection systems work by using a technology called machine learning, by which colossal amounts of data (in this case, extremist images, videos, and their metadata) are fed to an artificial intelligence adept at spotting patterns. Early types of machine learning could be trained to identify images containing a house, or a car, or a human face. But since 2017, Facebook and YouTube have been feeding these algorithms content that moderators have flagged as extremist — training them to automatically identify beheadings, propaganda videos and other unsavory content.
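
To make that mechanism concrete, here is a minimal sketch of supervised flagging of the kind the article describes: a classifier is trained on examples human moderators have labeled, then scores new uploads against removal thresholds. Everything below (the synthetic “feature vectors”, the thresholds) is an invented illustration, not either platform’s actual system, which uses deep networks over video, images and metadata at vastly larger scale.

```python
# Illustrative sketch only: a toy version of moderator-trained flagging.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in feature vectors for uploads; label 1 = flagged as extremist by
# human moderators, 0 = benign. Moderator decisions are the training data.
benign = rng.normal(0.0, 1.0, size=(1000, 32))
extremist = rng.normal(1.5, 1.0, size=(1000, 32))
X = np.vstack([benign, extremist])
y = np.concatenate([np.zeros(1000, dtype=int), np.ones(1000, dtype=int)])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# New uploads are scored; high-confidence detections can be removed before
# anyone views them, lower-confidence ones routed to human moderators.
scores = clf.predict_proba(X_test)[:, 1]
print("auto-removed:", (scores > 0.95).sum())
print("routed to human review:", ((scores > 0.5) & (scores <= 0.95)).sum())
```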

Both Facebook and YouTube are notoriously secretive about what kind of content they’re using to train the algorithms responsible for much of this deletion. That means there’s no way for outside observers to know whether innocent content — like Eye on Alhasakah’s — has already been fed in as training data, which would compromise the algorithm’s decision-making. In the case of Eye on Alhasakah’s takedown, “Facebook said, ‘oops, we made a mistake,’” says Dia Kayyali, the Tech and Advocacy coordinator at Witness, a human rights group focused on helping people record digital evidence of abuses. “But what if they had used the page as training data? Then that mistake has been exponentially spread throughout their system, because it’s going to train the algorithm more, and then more of that similar content that was mistakenly taken down is going to get taken down. I think that is exactly what’s happening now.” Facebook and YouTube, however, both deny this is possible. Facebook says it regularly retrains its algorithms to avoid this happening. In a statement, YouTube said: “decisions made by human reviewers help to improve the accuracy of our automated flagging systems.”
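
Kayyali’s worry can be sketched in the same toy terms: if footage from a wrongly banned page is fed back in as confirmed “extremist” training data, the retrained model scores similar documentation footage far higher. This is a schematic illustration of the contamination mechanism only, built on synthetic data; it is not a claim about Facebook’s or YouTube’s actual pipelines, which both companies say guard against this.

```python
# Toy illustration of training-data contamination (synthetic data only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

def sample(mean, n):
    # 16-dimensional stand-ins for video embeddings.
    return rng.normal(mean, 1.0, size=(n, 16))

propaganda = sample(+1.0, 2000)      # genuinely violating uploads
documentation = sample(-1.0, 2000)   # citizen-journalist war footage

X = np.vstack([propaganda, documentation])
y_clean = np.concatenate([np.ones(2000, dtype=int), np.zeros(2000, dtype=int)])

# Contamination: a wrongly banned page's videos (drawn from the documentation
# distribution) enter the training set labeled as confirmed extremist content.
banned_page = sample(-1.0, 200)
X_dirty = np.vstack([X, banned_page])
y_dirty = np.concatenate([y_clean, np.ones(200, dtype=int)])

clean = LogisticRegression(max_iter=1000).fit(X, y_clean)
dirty = LogisticRegression(max_iter=1000).fit(X_dirty, y_dirty)

# Fresh documentation uploads from other citizen journalists: the contaminated
# model assigns them much higher "extremist" scores, so aggressive flagging
# thresholds sweep up more of them.
fresh = sample(-1.0, 1000)
print("mean score, clean model:", clean.predict_proba(fresh)[:, 1].mean())
print("mean score, contaminated model:", dirty.predict_proba(fresh)[:, 1].mean())
```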

…….
That’s because Facebook’s policies allow some types of violence and extremism but not others — meaning decisions on whether to take content down are often based on cultural context. Has a video of an execution been shared by its perpetrators to spread fear? Or by a citizen journalist to ensure the wider world sees a grave human rights violation? A moderator’s answer to those questions could mean that of two identical videos, one remains online and the other is taken down. “This technology can’t yet effectively handle everything that is against our rules,” Saltman said. “Many of the decisions we have to make are complex and involve decisions around intent and cultural nuance which still require human eye and judgement.”

In this balancing act, it’s Facebook’s army of human moderators — many of them outsourced contractors — who carry the pole. And sometimes, they lose their footing. After several of Eye on Alhasakah’s posts were flagged by algorithms and humans alike, a Facebook moderator wrongly decided the page should be banned entirely for sharing violent videos in order to praise them — a violation of Facebook’s rules on violence and extremism, which state that some content can remain online if it is newsworthy, but not if it encourages violence or valorizes terrorism. The nuance, Facebook representatives told TIME, is important for balancing freedom of speech with a safe environment for its users — and keeping Facebook on the right side of government regulations.

Facebook’s set of rules on the topic reads like a gory textbook on ethics: beheadings, decomposed bodies, throat-slitting and cannibalism are all classed as too graphic, and thus never allowed; neither is dismemberment — unless it’s being performed in a medical setting; nor burning people, unless they are practicing self-immolation as an act of political speech, which is protected. Moderators are given discretion, however, if violent content is clearly being shared to spread awareness of human rights abuses. “In these cases, depending on how graphic the content is, we may allow it, but we place a warning screen in front of the content and limit the visibility to people aged 18 or over,” said Saltman. “We know not everyone will agree with these policies and we respect that.”

But civilian journalists operating in the heat of a civil war don’t always have time to read the fine print. And conflict monitors say it’s not enough for Facebook and YouTube to make all the decisions themselves. “Like it or not, people are using these social media platforms as a place of permanent record,” says Woods. “The social media sites don’t get to choose what’s of value and importance.”

See also: https://humanrightsdefenders.blog/2019/06/17/social-media-councils-an-answer-to-problems-of-content-moderation-and-distribution/

https://time.com/5798001/facebook-youtube-algorithms-extremism/

Physicians for Human Rights gets Dodd human rights award

February 4, 2017

Physicians for Human Rights, an organization that for decades has documented war crimes and atrocities, will be awarded the Thomas J. Dodd Prize in International Justice and Human Rights, the University of Connecticut announced on 2 February 2017. “Physicians for Human Rights exemplifies the kind of work the Dodd Prize was created to honor,” former U.S. Sen. Christopher J. Dodd, the son of the Nuremberg prosecutor and senator for whom the award is named, said in a statement. “My father would recognize in PHR the same spirit that animated the Nuremberg Tribunals, but also would be amazed at PHR’s innovation and courage in seeking justice and accountability for the perpetrators of atrocities.”

Using forensic science, medicine and public health research, Physicians for Human Rights documents crimes against humanity across the world, including, in the past, in Bosnia and the Democratic Republic of the Congo, UConn said in announcing the award. The group also trains professionals worldwide to carry out similar investigation and prevention work, the announcement said. PHR shared the 1997 Nobel Peace Prize for its work on the International Campaign to Ban Landmines.

PHR will be presented the award, which comes with a $100,000 prize, in November this year.

Source: Physicians For Human Rights To Receive Dodd Prize – Hartford Courant

DiploHack event on human rights to be held in Geneva on 26-27 February

February 24, 2016

The Permanent Mission of the Netherlands in Geneva, THE Port Association (https://twitter.com/theportatcern) and Impact Hub Geneva will host a hackathon in the field of human rights on 26-27 February 2016. The Human Rights DiploHack event will bring together diplomats and human rights experts with tech developers, designers, innovators and entrepreneurs from all over Europe and beyond, to experiment and innovate on projects that directly impact people’s lives. Drawing on the multidisciplinary expertise of the participants, teams will be formed to work on two challenges presented by the Office of the High Commissioner for Human Rights (OHCHR):

  • “How can Human Rights Defenders collect and transport evidence in a safe way?”, and
  • “Does a photo or video tell the ‘truth’?”

[As a first contribution, I refer to the video-as-evidence instructions posted by Witness on 18 February 2016.]

The results will be presented at the Palais des Nations on 29 February, during a side event open to the public on the occasion of the Human Rights Council (for accreditation to this side event, non-UN-badge holders are invited to contact the organizers before 24 February). True Heroes Films (THF) will be filming the event and will produce a short film to be shown at the side event.

http://www.diplohack.org/geneva-diplohack-for-human-rights.html

Are human rights videos making a difference?

September 3, 2014

Yvette Alberdingk Thijm, the Executive Director of WITNESS, posted an important piece in the Huffington Post of 2 September on how to make sure that the growing number of human rights videos uploaded to Witness (and to other NGOs) makes a real difference. After citing several examples of such footage of violence, conflict, and human rights abuses, she reflects as follows: “When I watch these videos with such potential to transform human rights advocacy, I am concerned about the gaps and the lost opportunities: the videos that cannot be authenticated; the stories that will be denied or thrown out of court — or worse, will never reach their intended audience; a survivor’s account lost in a visual sea of citizen media. Mostly, I worry about the safety of the person who filmed, about her privacy and security.”

…….

“When WITNESS was created, we talked about the power of video to “open the eyes of the world to human rights violations.” Today, our collective eyes have been opened to many of the conflicts and abuses that are going on around us. This creates, for all of us, a responsibility to engage. I am deeply convinced that citizen documentation has the power to transform human rights advocacy, change behaviors, and increase accountability. But let’s make sure that all of us filming have the right tools and capabilities, and that we apply and share the lessons we are learning from citizen witnesses around the world, so that more people filming truly equals more rights.”

How Do We Ensure That More People Using Video Equals More Rights? | Yvette Alberdingk Thijm.

Using Video for Documentation and Evidence: online course by New Tactics from 21 July

July 7, 2014

Citizen media (Photo credit: WITNESS, used under Creative Commons)

Kelly Matheson of WITNESS and the New Tactics community are organising an online conversation on Using Video for Documentation and Evidence from 21 to 25 July 2014. User-generated content can be instrumental in drawing attention to human rights abuses. But many filmers and activists want their videos to do more: they have the underlying expectation that footage exposing abuse can help bring about justice. Unfortunately, the quality of citizen video and other content rarely passes the higher bar needed to function as evidence in a court of law. This online discussion is an opportunity for practitioners of law, technology and human rights to share their experiences, challenges, tools and ideas to help increase the chances that the footage citizens and activists often risk their lives to capture can do more than expose injustice – it can also serve as evidence in criminal and civil justice processes.

Using Video for Documentation and Evidence | New Tactics in Human Rights.

Archiving video should not be a dirty word for Human Rights Defenders

January 22, 2014

This blog has often referred to the growing role of images in the protection of human rights. The Activists’ Guide to Archiving Video, produced by the NGO Witness, is one tool that can greatly help those who want to be part of this development. The term “archive” may turn off many human rights defenders as something boring, or at least not deserving priority, but neglecting it would be a big error. As the Witness guide asks very clearly:

  • Do you want your videos to be available in the future?
  • Do you want your videos to serve as evidence of crimes or human rights abuses?
  • Do you want your videos to raise awareness and educate future generations?

The risks of not archiving are big:

  1. Your videos may exist somewhere, but no one can find them.
  2. Someone may find your videos, but cannot understand what they are about.
  3. Your videos cannot be sufficiently authenticated or corroborated as evidence.
  4. Your videos’ quality may become so degraded that no one can use them.
  5. Your videos may be in a format that eventually no one can play.
  6. Your videos may be accidentally or deliberately deleted and lost forever.

In further sections the Guide helps the reader understand how videos can be made accessible (shared) and brings clarity to tricky issues such as file formats and copyright.
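
As a concrete illustration of what acting on these risks can look like in practice, here is a minimal sketch of one common archiving step: recording a cryptographic fingerprint and basic descriptive metadata for each video at intake, so footage stays findable (risks 1 and 2) and can later be shown to be unaltered (risk 3). The function, field names and catalog format below are illustrative assumptions, not taken from the Witness guide itself.

```python
# Minimal sketch: fingerprint a video file and record it in a simple catalog.
import hashlib
import json
import os
from datetime import datetime, timezone

def register_video(path, description, catalog="catalog.json"):
    # SHA-256 fingerprint: any later change to the file changes this value,
    # which supports authenticating archived footage.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)

    entry = {
        "file": os.path.basename(path),
        "sha256": h.hexdigest(),
        "bytes": os.path.getsize(path),
        "description": description,  # what was filmed, where, by whom
        "registered": datetime.now(timezone.utc).isoformat(),
    }

    # Append to a JSON catalog so the video remains findable and understood.
    entries = []
    if os.path.exists(catalog):
        with open(catalog) as f:
            entries = json.load(f)
    entries.append(entry)
    with open(catalog, "w") as f:
        json.dump(entries, f, indent=2)
    return entry

# Example (hypothetical file name):
# register_video("2014-01-18_checkpoint.mp4", "Checkpoint incident, filmed by A.B.")
```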

Worth a visit!!

Activists Guide to Archiving Video | archiveguide.witness.org.

1 year of Human Rights Channel on YouTube: 90 countries. 1,892 videos

May 27, 2013

Twelve months ago, Witness and its partners at Storyful launched the first dedicated space on YouTube for verified citizen video on human rights issues.