It will take only 3 minutes to watch this well-done animated film “We Have Rights When Documenting ICE Arrests”, which Witness co-created for the We Have Rights Campaign.
But algorithms are notoriously worse than humans at understanding one crucial thing: context. Now, as Facebook and YouTube have come to rely on them more and more, even innocent photos and videos, especially from war zones, are being swept up and removed. Such content can serve a vital purpose for both civilians on the ground — for whom it provides essential real-time information — and human rights monitors far away. In 2017, for the first time ever, the International Criminal Court in the Netherlands issued a war-crimes indictment based on videos from Libya posted on social media. And as violence-detection algorithms have developed, conflict monitors are noticing an unexpected side effect, too: these algorithms could be removing evidence of war crimes from the Internet before anyone even knows it exists.
…..
It was an example of how even one mistaken takedown can make the work of human rights defenders more difficult. Yet this is happening on a wider scale: of the 1.7 million YouTube videos preserved by Syrian Archive, a Berlin-based non-profit that downloads evidence of human rights violations, 16% have been removed. A huge chunk was taken down in 2017, just as YouTube began using algorithms to flag violent and extremist content. And useful content is still being removed on a regular basis. “We’re still seeing that this is a problem,” says Jeff Deutsch, the lead researcher at Syrian Archive. “We’re not saying that all this content has to remain public forever. But it’s important that this content is archived, so it’s accessible to researchers, to human rights groups, to academics, to lawyers, for use in some kind of legal accountability.” (YouTube says it is working with Syrian Archive to improve how they identify and preserve footage that could be useful for human rights groups.)
…..
Facebook and YouTube’s detection systems work by using a technology called machine learning, by which colossal amounts of data (in this case, extremist images, videos, and their metadata) are fed to an artificial intelligence adept at spotting patterns. Early types of machine learning could be trained to identify images containing a house, or a car, or a human face. But since 2017, Facebook and YouTube have been feeding these algorithms content that moderators have flagged as extremist — training them to automatically identify beheadings, propaganda videos and other unsavory content.
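For a concrete (if highly simplified) picture of what such training looks like, here is a minimal sketch in Python with scikit-learn. The embeddings, the labelling rule, and the 0.9 flagging threshold are all invented for illustration and are not drawn from either platform's real system.

```python
# A minimal sketch of supervised "extremist content" classification,
# assuming synthetic 64-dimensional embeddings in place of real image,
# video, or metadata features. Purely illustrative: not the actual
# Facebook or YouTube pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical training data: embeddings of content that human
# moderators labelled extremist (1) or benign (0).
X = rng.normal(size=(1000, 64))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # stand-in labelling rule

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

# At upload time, new content is scored and anything above a
# threshold is flagged for removal or human review.
new_item = rng.normal(size=(1, 64))
print("flag for review:", clf.predict_proba(new_item)[0, 1] > 0.9)
```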
Both Facebook and YouTube are notoriously secretive about what kind of content they’re using to train the algorithms responsible for much of this deletion. That means there’s no way for outside observers to know whether innocent content — like Eye on Alhasakah’s — has already been fed in as training data, which would compromise the algorithm’s decision-making. In the case of Eye on Alhasakah’s takedown, “Facebook said, ‘oops, we made a mistake,’” says Dia Kayyali, the Tech and Advocacy coordinator at Witness, a human rights group focused on helping people record digital evidence of abuses. “But what if they had used the page as training data? Then that mistake has been exponentially spread throughout their system, because it’s going to train the algorithm more, and then more of that similar content that was mistakenly taken down is going to get taken down. I think that is exactly what’s happening now.” Facebook and YouTube, however, both deny this is possible. Facebook says it regularly retrains its algorithms to avoid this happening. In a statement, YouTube said: “decisions made by human reviewers help to improve the accuracy of our automated flagging systems.”
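Kayyali's worry can be made concrete with a toy simulation: a classifier is retrained on its own takedowns, with every removed item labelled as extremist, mistakes included. The data, thresholds, and drift below are entirely hypothetical; this is the scenario the platforms deny, not a description of their systems.

```python
# A toy simulation of the feedback loop Kayyali describes: everything
# the model takes down is fed back as an "extremist" training label,
# mistakes included. Entirely hypothetical data and thresholds.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def sample(n):
    X = rng.normal(size=(n, 2))
    y_true = (X[:, 0] > 0.5).astype(int)  # ground truth: extremist iff x0 > 0.5
    return X, y_true

X, y = sample(2000)                       # round 0: cleanly labelled data
clf = LogisticRegression().fit(X, y)

for rnd in range(1, 6):
    X_new, y_true = sample(2000)
    removed = clf.predict_proba(X_new)[:, 1] > 0.3   # aggressive takedowns
    mistakes = (removed & (y_true == 0)).mean()
    print(f"round {rnd}: innocent content removed = {mistakes:.1%}")
    # Feed every takedown back as a positive label and retrain.
    X = np.vstack([X, X_new[removed]])
    y = np.concatenate([y, np.ones(removed.sum(), dtype=int)])
    clf = LogisticRegression().fit(X, y)
```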
…….
That’s because Facebook’s policies allow some types of violence and extremism but not others — meaning decisions on whether to take content down are often based on cultural context. Has a video of an execution been shared by its perpetrators to spread fear? Or by a citizen journalist to ensure the wider world sees a grave human rights violation? A moderator’s answer to those questions could mean that of two identical videos, one remains online and the other is taken down. “This technology can’t yet effectively handle everything that is against our rules,” Saltman said. “Many of the decisions we have to make are complex and involve decisions around intent and cultural nuance which still require human eye and judgement.”
In this balancing act, it’s Facebook’s army of human moderators — many of them outsourced contractors — who carry the pole. And sometimes, they lose their footing. After several of Eye on Alhasakah’s posts were flagged by algorithms and humans alike, a Facebook moderator wrongly decided the page should be banned entirely for sharing violent videos in order to praise them — a violation of Facebook’s rules on violence and extremism, which state that some content can remain online if it is newsworthy, but not if it encourages violence or valorizes terrorism. The nuance, Facebook representatives told TIME, is important for balancing freedom of speech with a safe environment for its users — and keeping Facebook on the right side of government regulations.
Facebook’s set of rules on the topic reads like a gory textbook on ethics: beheadings, decomposed bodies, throat-slitting and cannibalism are all classed as too graphic, and thus never allowed; neither is dismemberment — unless it’s being performed in a medical setting; nor burning people, unless they are practicing self-immolation as an act of political speech, which is protected. Moderators are given discretion, however, if violent content is clearly being shared to spread awareness of human rights abuses. “In these cases, depending on how graphic the content is, we may allow it, but we place a warning screen in front of the content and limit the visibility to people aged 18 or over,” said Saltman. “We know not everyone will agree with these policies and we respect that.”
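As a thought experiment, that rule book can be written out as a decision function. The field names, categories, and outcomes below merely paraphrase the article's summary; the real policy and its enforcement are far more involved, and this sketch is in no way Facebook's actual rule engine.

```python
# A thought-experiment encoding of the rules paraphrased above.
# Field names, categories, and outcomes are invented for illustration.
from dataclasses import dataclass

@dataclass
class Content:
    category: str                          # e.g. "beheading", "dismemberment", "burning"
    medical_setting: bool = False          # dismemberment exception
    self_immolation_protest: bool = False  # protected political speech
    raises_awareness: bool = False         # shared to document abuses?
    encourages_violence: bool = False      # praises or valorises the act?

ALWAYS_REMOVE = {"beheading", "decomposed_bodies", "throat_slitting", "cannibalism"}

def moderate(c: Content) -> str:
    if c.encourages_violence:
        return "remove"                    # valorising terrorism: never allowed
    if c.category in ALWAYS_REMOVE:
        return "remove"                    # too graphic, no exceptions
    if c.category == "dismemberment" and not c.medical_setting:
        return "remove"
    if c.category == "burning" and not c.self_immolation_protest:
        return "remove"
    if c.raises_awareness:
        return "allow_with_warning_18plus" # warning screen, age-gated
    return "escalate_to_human_review"      # intent unclear: human judgement

print(moderate(Content("burning", self_immolation_protest=True, raises_awareness=True)))
```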
But civilian journalists operating in the heat of a civil war don’t always have time to read the fine print. And conflict monitors say it’s not enough for Facebook and YouTube to make all the decisions themselves. “Like it or not, people are using these social media platforms as a place of permanent record,” says Woods. “The social media sites don’t get to choose what’s of value and importance.”
https://time.com/5798001/facebook-youtube-algorithms-extremism/
Physicians for Human Rights, an organization that for decades has documented war crimes and atrocities, will be awarded the Thomas J. Dodd Prize in International Justice and Human Rights, the University of Connecticut announced on 2 February 2017. “Physicians for Human Rights exemplifies the kind of work the Dodd Prize was created to honor,” former U.S. Sen. Christopher J. Dodd, the son of the Nuremberg prosecutor and senator for whom the award is named, said in a statement. “My father would recognize in PHR the same spirit that animated the Nuremberg Tribunals, but also would be amazed at PHR’s innovation and courage in seeking justice and accountability for the perpetrators of atrocities.”
Using forensic science, medicine and public health research, Physicians for Human Rights documents crimes against humanity across the world, including past investigations in Bosnia and the Democratic Republic of the Congo, UConn said in announcing the award. The group also trains professionals worldwide to carry out similar investigation and prevention work, the announcement said. PHR shared the 1997 Nobel Peace Prize for its work on the International Campaign to Ban Landmines.
PHR will be presented with the award, which comes with a $100,000 prize, in November 2017.
Source: Physicians For Human Rights To Receive Dodd Prize – Hartford Courant
The Permanent Mission of the Netherlands in Geneva, THE Port Association (https://twitter.com/theportatcern) and Impact Hub Geneva will host their hackathon in the field of human rights on 26-27 February 2016. The Human Rights DiploHack event will bring together diplomats and human rights experts with tech developers, designers, innovators and entrepreneurs from all over Europe and beyond, to experiment and innovate on projects that directly impact people’s lives. Drawing on the multidisciplinary expertise of the participants, teams will be formed to work on two challenges presented by the Office of the High Commissioner for Human Rights (OHCHR).
[As a first contribution, I refer to the video-as-evidence instructions posted by Witness on 18 February 2016.]
The results will be presented at the Palais des Nations on 29 February, during a side event open to the public on the occasion of the Human Rights Council (for accreditation to this side event, non-UN badge holders are invited to contact the organizers before 24 February). True Heroes Films (THF) will be filming the event and will produce a short film to be shown at the side event.
http://www.diplohack.org/geneva-diplohack-for-human-rights.html
Yvette Alberdingk Thijm, the Executive Director of WITNESS, posted an important piece in the Huffington Post on 2 September on how to make sure that the growing number of human rights videos uploaded to Witness (and to other NGOs) makes a real difference. After citing several examples of such footage of violence, conflict, and human rights abuses, she reflects as follows: “When I watch these videos with such potential to transform human rights advocacy, I am concerned about the gaps and the lost opportunities: the videos that cannot be authenticated; the stories that will be denied or thrown out of court — or worse, will never reach their intended audience; a survivor’s account lost in a visual sea of citizen media. Mostly, I worry about the safety of the person who filmed, about her privacy and security.”
…….
“When WITNESS was created, we talked about the power of video to ‘open the eyes of the world to human rights violations.’ Today, our collective eyes have been opened to many of the conflicts and abuses that are going on around us. This creates, for all of us, a responsibility to engage. I am deeply convinced that citizen documentation has the power to transform human rights advocacy, change behaviors, and increase accountability. But let’s make sure that all of us filming have the right tools and capabilities, and that we apply and share the lessons we are learning from citizen witnesses around the world, so that more people filming truly equals more rights.”
How Do We Ensure That More People Using Video Equals More Rights? | Yvette Alberdingk Thijm.
(Photo credit: WITNESS, used under Creative Commons)
Kelly Matheson of WITNESS and the New Tactics community are organising an online conversation on Using Video for Documentation and Evidence from 21 to 25 July 2014. User-generated content can be instrumental in drawing attention to human rights abuses. But many filmers and activists want their videos to do more. They have the underlying expectation that footage exposing abuse can help bring about justice. Unfortunately, the quality of citizen video and other content rarely passes the higher bar needed to function as evidence in a court of law. This online discussion is an opportunity for practitioners of law, technology and human rights to share their experiences, challenges, tools and ideas to help increase the chances that the footage citizens and activists often risk their lives to capture can do more than expose injustice – it can also serve as evidence in the criminal and civil justice processes.
Using Video for Documentation and Evidence | New Tactics in Human Rights.
This blog has often referred to the growing role of images in the protection of human rights. The Activists Guide to Archiving Video produced by the NGO Witness is one tool that can greatly help those who want to be part of this development. The term “archive” may turn off many human rights defenders as something boring, or at least not deserving priority, but to neglect it would be a big error. As the Witness guide explains very clearly, the risks of not archiving are big. Further sections of the Guide help readers understand how videos can be made accessible (shared) and bring clarity to tricky issues such as file formats and copyright.
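By way of illustration, one building block of such an archive is recording a cryptographic hash and basic metadata the moment a file is saved, so that its integrity can be verified years later. The sketch below assumes SHA-256 and a JSON sidecar file; it is a generic example, not the Guide's own prescribed workflow.

```python
# A generic sketch of one archiving building block: hash a video at
# save time and keep the record alongside it, so the file's integrity
# can be verified later. Assumes SHA-256 and a JSON sidecar; purely
# illustrative.
import datetime
import hashlib
import json
import os

def archive_record(path: str) -> dict:
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
            sha256.update(chunk)
    return {
        "file": os.path.basename(path),
        "bytes": os.path.getsize(path),
        "sha256": sha256.hexdigest(),  # any later change to the file changes this
        "archived_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }

# Hypothetical usage with an example filename:
# record = archive_record("protest_2014-07-21.mp4")
# with open("protest_2014-07-21.json", "w") as f:
#     json.dump(record, f, indent=2)
```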
Worth a visit!
Activists Guide to Archiving Video | archiveguide.witness.org.
Twelve months ago, Witness and its partners at Storyful launched the first dedicated space on YouTube for verified citizen video on human rights issues.