Posts Tagged ‘ISIS’

Algorithms designed to suppress ISIS content may also suppress evidence of human rights violations

April 11, 2020

Facebook and YouTube designed algorithms to suppress ISIS content. They're having unexpected side effects.

Illustration by Leo Acadia for TIME
TIME of 11 April 2020 carries a long article by Billy Perrigo entitled “These Tech Companies Managed to Eradicate ISIS Content. But They’re Also Erasing Crucial Evidence of War Crimes”. It is a very interesting piece that clearly spells out the dilemma of suppressing too much or too little on Facebook, YouTube, etc. Algorithms designed to suppress ISIS content are having unexpected side effects, such as suppressing evidence of human rights violations.
…..Images posted by citizen journalist Abo Liath Aljazarawy to his Facebook page (Eye on Alhasakah) showed the ground reality of the Syrian civil war. His page was banned. Facebook confirmed to TIME that Eye on Alhasakah was flagged in late 2019 by its algorithms, as well as by users, for sharing “extremist content.” It was then funneled to a human moderator, who decided to remove it. After being notified by TIME, Facebook restored the page in early February, some 12 weeks later, saying the moderator had made a mistake. (Facebook declined to say which specific videos were wrongly flagged, except that there were several.)

The algorithms were developed largely in reaction to ISIS, which shocked the world in 2014 when it began to share slickly produced online videos of executions and battles as propaganda. Because of the very real way these videos radicalized viewers, the U.S.-led coalition in Iraq and Syria worked overtime to suppress them, and enlisted social networks to help. Quickly, the companies discovered that there was too much content for even a huge team of humans to deal with. (More than 500 hours of video are uploaded to YouTube every minute.) So, since 2017, the companies have been using algorithms to automatically detect extremist content. Early on, those algorithms were crude, and only supplemented the human moderators’ work. But now, following three years of training, they are responsible for an overwhelming proportion of detections. Facebook now says more than 98% of content removed for violating its rules on extremism is flagged automatically. On YouTube, across the board, more than 20 million videos were taken down before receiving a single view in 2019. And as the coronavirus spread across the globe in early 2020, Facebook, YouTube and Twitter announced their algorithms would take on an even larger share of content moderation, with human moderators barred from taking sensitive material home with them.

But algorithms are notoriously worse than humans at understanding one crucial thing: context. Now, as Facebook and YouTube have come to rely on them more and more, even innocent photos and videos, especially from war zones, are being swept up and removed. Such content can serve a vital purpose for both civilians on the ground — for whom it provides vital real-time information — and human rights monitors far away. In 2017, for the first time ever, the International Criminal Court in the Netherlands issued a war-crimes indictment based on videos from Libya posted on social media. And as violence-detection algorithms have developed, conflict monitors are noticing an unexpected side effect, too: these algorithms could be removing evidence of war crimes from the Internet before anyone even knows it exists.

…..
It was an example of how even one mistaken takedown can make the work of human rights defenders more difficult. Yet this is happening on a wider scale: of the 1.7 million YouTube videos preserved by Syrian Archive, a Berlin-based non-profit that downloads evidence of human rights violations, 16% have been removed. A huge chunk was taken down in 2017, just as YouTube began using algorithms to flag violent and extremist content. And useful content is still being removed on a regular basis. “We’re still seeing that this is a problem,” says Jeff Deutsch, the lead researcher at Syrian Archive. “We’re not saying that all this content has to remain public forever. But it’s important that this content is archived, so it’s accessible to researchers, to human rights groups, to academics, to lawyers, for use in some kind of legal accountability.” (YouTube says it is working with Syrian Archive to improve how they identify and preserve footage that could be useful for human rights groups.)

…..

Facebook and YouTube’s detection systems work by using a technology called machine learning, by which colossal amounts of data (in this case, extremist images, videos, and their metadata) are fed to an artificial intelligence adept at spotting patterns. Early types of machine learning could be trained to identify images containing a house, or a car, or a human face. But since 2017, Facebook and YouTube have been feeding these algorithms content that moderators have flagged as extremist — training them to automatically identify beheadings, propaganda videos and other unsavory content.
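The idea described above — feeding labeled examples to a system that learns to spot a pattern — can be illustrated with a toy sketch. This is emphatically not Facebook's or YouTube's actual system (those are proprietary and operate on images and video); it is a minimal perceptron trained on made-up numeric feature scores, just to show what "training on flagged content" means mechanically:

```python
# Toy sketch of machine-learning classification (NOT the real platforms'
# systems). Each piece of content is reduced to hypothetical feature
# scores, e.g. [graphic_violence_score, propaganda_branding_score],
# and the classifier learns a boundary from labeled examples.

def train_perceptron(examples, epochs=20, lr=0.1):
    """examples: list of (feature_vector, label); label 1 = extremist."""
    w = [0.0] * len(examples[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, y in examples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred          # 0 if correct; nudge weights otherwise
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(w, b, x):
    """Return 1 (flag for removal) or 0 (keep)."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Hypothetical training data, standing in for moderator-flagged content:
training = [
    ([0.9, 0.8], 1), ([0.8, 0.9], 1),   # flagged extremist uploads
    ([0.1, 0.0], 0), ([0.2, 0.1], 0),   # ordinary footage
]
w, b = train_perceptron(training)
```

After training, new uploads near the "flagged" examples are removed automatically, which is the core of what the article means by algorithms taking over from human moderators at scale.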

Both Facebook and YouTube are notoriously secretive about what kind of content they’re using to train the algorithms responsible for much of this deletion. That means there’s no way for outside observers to know whether innocent content — like Eye on Alhasakah’s — has already been fed in as training data, which would compromise the algorithm’s decision-making. In the case of Eye on Alhasakah’s takedown, “Facebook said, ‘oops, we made a mistake,’” says Dia Kayyali, the Tech and Advocacy coordinator at Witness, a human rights group focused on helping people record digital evidence of abuses. “But what if they had used the page as training data? Then that mistake has been exponentially spread throughout their system, because it’s going to train the algorithm more, and then more of that similar content that was mistakenly taken down is going to get taken down. I think that is exactly what’s happening now.” Facebook and YouTube, however, both deny this is possible. Facebook says it regularly retrains its algorithms to avoid this happening. In a statement, YouTube said: “decisions made by human reviewers help to improve the accuracy of our automated flagging systems.”
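Kayyali's worry — a mistaken takedown fed back as training data, causing similar innocent content to be removed in turn — can be sketched in miniature. The classifier, feature names and scores below are all hypothetical; the point is only to show the feedback mechanism:

```python
# Hypothetical illustration of the feedback loop described above:
# a simple nearest-neighbour classifier over made-up feature scores
# [graphic_violence_score, propaganda_branding_score].

def nearest_label(training, x):
    """Label x like its closest training example (1 = remove, 0 = keep)."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(training, key=lambda ex: dist(ex[0], x))[1]

training = [
    ([0.9, 0.9], 1),  # genuine extremist propaganda
    ([0.1, 0.1], 0),  # ordinary footage
]

# A citizen journalist's video: graphic, but no propaganda branding.
citizen_video = [0.8, 0.1]
before = nearest_label(training, citizen_video)   # kept (0)

# A moderator's mistaken takedown is fed back as training data:
training.append(([0.7, 0.15], 1))  # innocent war-zone video, mislabelled

# Similar innocent footage is now flagged automatically:
similar_video = [0.75, 0.12]
after = nearest_label(training, similar_video)    # removed (1)
```

One mislabelled example shifts the decision for everything near it, which is why conflict monitors worry about errors compounding; the platforms' denial rests on retraining procedures that outside observers cannot inspect.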

…….
That’s because Facebook’s policies allow some types of violence and extremism but not others — meaning decisions on whether to take content down are often based on cultural context. Has a video of an execution been shared by its perpetrators to spread fear? Or by a citizen journalist to ensure the wider world sees a grave human rights violation? A moderator’s answer to those questions could mean that of two identical videos, one remains online and the other is taken down. “This technology can’t yet effectively handle everything that is against our rules,” Saltman said. “Many of the decisions we have to make are complex and involve decisions around intent and cultural nuance which still require human eye and judgement.”

In this balancing act, it’s Facebook’s army of human moderators — many of them outsourced contractors — who carry the pole. And sometimes, they lose their footing. After several of Eye on Alhasakah’s posts were flagged by algorithms and humans alike, a Facebook moderator wrongly decided the page should be banned entirely for sharing violent videos in order to praise them — a violation of Facebook’s rules on violence and extremism, which state that some content can remain online if it is newsworthy, but not if it encourages violence or valorizes terrorism. The nuance, Facebook representatives told TIME, is important for balancing freedom of speech with a safe environment for its users — and keeping Facebook on the right side of government regulations.

Facebook’s set of rules on the topic reads like a gory textbook on ethics: beheadings, decomposed bodies, throat-slitting and cannibalism are all classed as too graphic, and thus never allowed; neither is dismemberment — unless it’s being performed in a medical setting; nor burning people, unless they are practicing self-immolation as an act of political speech, which is protected. Moderators are given discretion, however, if violent content is clearly being shared to spread awareness of human rights abuses. “In these cases, depending on how graphic the content is, we may allow it, but we place a warning screen in front of the content and limit the visibility to people aged 18 or over,” said Saltman. “We know not everyone will agree with these policies and we respect that.”

But civilian journalists operating in the heat of a civil war don’t always have time to read the fine print. And conflict monitors say it’s not enough for Facebook and YouTube to make all the decisions themselves. “Like it or not, people are using these social media platforms as a place of permanent record,” says Woods. “The social media sites don’t get to choose what’s of value and importance.”

See also: https://humanrightsdefenders.blog/2019/06/17/social-media-councils-an-answer-to-problems-of-content-moderation-and-distribution/

https://time.com/5798001/facebook-youtube-algorithms-extremism/

Collecting human rights prize, Yazidi lawmaker calls Trump’s travel ban ‘unfair’

February 9, 2017

Iraqi lawmaker Vian Dakhil speaks after receiving the Lantos Human Rights Prize at a Capitol Hill ceremony on Feb. 8, 2017. RNS photo Adelle M. Banks

Last week I wrote about an award-winning human rights defender not being able to come and collect her award in the USA [https://humanrightsdefenders.blog/2017/02/01/yazidi-human-rights-laureate-may-be-banned-from-coming-to-washington-to-accept-award/]. Vian Dakhil made it to Washington in the end. She had already received a visa to come to Washington to accept an award from the Tom Lantos Foundation when President Donald Trump’s executive order pausing immigration from seven Muslim-majority countries, including Iraq, was issued. After an arduous process involving the State Department and the Iraqi Embassy, she was granted an exemption to the travel ban so she could attend the award ceremony on 8 February. Her sister, who acted as her translator, was able to get a visa after a federal judge temporarily halted the implementation of the executive order.

Syrian citizen-journalist Abdalaziz Alhamza’s talk at the 2016 Oslo Freedom Forum

June 8, 2016

Abdalaziz Alhamza and his team of citizen journalists risk their lives to smuggle video out of Syria to expose the shocking brutality of both the Assad regime and ISIS. Now ISIS has put a price on his head. Abdalaziz took the stage at the 2016 Oslo Freedom Forum of the Human Rights Foundation to talk about his work and his hopes for a brighter future in Syria.

Human Rights NGOs in UK under pressure from politicians and tabloids not to be ‘apologists’ for terrorism

March 3, 2015

It is not often that the Daily Mail, a British tabloid, writes about human rights defenders, but when it does [3 March 2015], it is vicious. Under the headline “No excuses! Theresa May leads politicians queuing up to blast British apologists for ISIS murderers“, it zooms in on Amnesty International and other NGOs that have worked on occasion with a local group called Cage. The latter is an Islamic group led by former Guantanamo Bay prisoner Moazzam Begg. The group’s research director, Asim Qureshi, recently described IS killer Mohammed Emwazi (“Jihadi John“) as a ‘beautiful young man’ and accused the security services of radicalising him.

This then led British politicians, from government and opposition alike, to outdo each other in demanding, in the strongest possible terms, that everybody distance themselves from the group. E.g., Theresa May, the Home Secretary, said: ‘I condemn anyone who attempts to excuse that barbarism in the way that has been done by Cage.’ Jacqui Smith, a Labour former Home Secretary, called Cage ‘outrageous apologists’.

Steve Crawshaw, of the office of the secretary general at Amnesty, admitted yesterday it was ‘highly unlikely’ they would work with Cage again, although together with Liberty, Justice and five other human rights groups, it had joined with Cage in a ‘collective’ to make representations to an inquiry into the treatment of British Army detainees.
Asked if Amnesty had played to a ‘myth’ of victimisation, Mr Crawshaw added: ‘I don’t think we have played to anybody’s myth. I can’t condemn strongly enough anybody, in any context who seeks to find some justification somehow for how they can justify killing civilians…Our colleagues there (in Iraq) are risking lives in order to document the terrible crimes of IS and therefore to hear somehow that we are turning away from those things, I do think is quite extraordinary.’

Amnesty International UK Director Kate Allen said yesterday: ‘Amnesty has no formal or financial relationship with Cage. Amnesty has, along with a number of other human rights organisations, worked on issues relating to Guantanamo and torture.’

Read more: No excuses! Theresa May leads politicians queuing up to blast British apologists for ISIS murderers | Daily Mail Online.

Kurdish Yazidi Woman Wins Anna Politkovskaya Award

October 11, 2014

More on awards: the winner of the 2014 RAW in WAR Anna Politkovskaya Award is Kurdish Yazidi member of the Iraqi parliament Vian Dakhil. On Monday 6 October, RAW in WAR (Reach All Women in WAR) selected Vian Dakhil, who has courageously spoken out and campaigned to protect the Yazidi people from the terror of Islamic State. She is the only ethnic Yazidi in the Iraqi Parliament and, despite being injured in a helicopter crash while delivering aid to survivors on Mt Sinjar, she continues to advocate and to mobilize support for her people, for the refugees and for those trapped in towns and villages under the regime of Islamic State. “I make no secret of the fact that I’m proud to be honored with your esteemed award, but the real way to honor someone is by protecting their freedom and rights. It is by bringing our prisoners back,” said Dakhil in her speech while receiving the award.

Previous women human rights defenders who received this award: Malala Yousafzai 2013, Marie Colvin 2012, Razan Zaitouneh 2011, Dr. Halima Bashir 2010, Leila Alikarami on behalf of the One Million Signatures Campaign for Equality in Iran 2009, Malalai Joya 2008 and Natalia Estemirova 2007. See also: http://www.trueheroesfilms.org/thedigest/awards

via Kurdish Yazidi Woman Wins International Award | BAS NEWS.