On 15 December 2025 Emma Woollacott reported in Forbes on a new study showing that 7 in 10 women human rights defenders, activists and journalists have experienced online violence in the course of their work. Produced through UN Women’s ACT to End Violence against Women program and supported by the European Commission, “Tipping point: The chilling escalation of violence against women in the public sphere” draws on a global survey of women from 119 countries.
Along with online threats and harassment, more than 4 in 10 have experienced offline harm linked to online abuse — more than twice as many as in 2020, the researchers found. This can range from verbal harassment right up to physical assault, stalking and swatting.
“These figures confirm that digital violence is not virtual — it’s real violence with real-world consequences,” said Sarah Hendricks, director of policy, programme and intergovernmental division at UN Women.
“Women who speak up for our human rights, report the news or lead social movements are being targeted with abuse designed to shame, silence and push them out of public debate. Increasingly, those attacks do not stop at the screen — they end at women’s front doors. We cannot allow online spaces to become platforms for intimidation that silence women and undermine democracy.”
And AI is only making things worse, with almost 1 in 4 women human rights defenders, activists and journalists having experienced AI-assisted online violence, such as deepfake imagery and manipulated content. This is most often the case for writers and public communicators who focus on human rights issues, such as social media content creators and influencers, for whom the figure reaches 30%.
“Gender-based online violence is not a new phenomenon, but its scale certainly is,” said report co-author Lea Hellmueller, associate professor in journalism at City St George’s and associate dean for research in the School of Communication and Creativity.
“AI tools enable the production of cheaper and faster abusive content, which is detrimental to women in public life — and beyond,” Hellmueller added.
Tech firms are partly responsible, the researchers said, with the report calling for better tools to identify, monitor, report and fend off AI-assisted online violence. The researchers also want to see more legal and regulatory mechanisms to force tech firms to prevent their technologies being deployed against women in the public sphere.
“Our next steps include publishing data from the survey about the opportunities for, and barriers to, law enforcement and legal redress for survivors of online violence,” said Julie Posetti, chair of the Centre for Journalism and Democracy at City St George’s, University of London, one of the authors of the report. “We will also focus on creative efforts to counter gender-based online violence and policy recommendations to help hold the Big Tech facilitators of this dangerous phenomenon accountable.”
Safety Net funding is for individual women human rights defenders (WHRDs) from or working in conflict- and crisis-affected countries who, due to their commitment to human rights and peace, currently face – or have faced – risks whose resulting impacts continue to threaten their safety and work. SAFETY NET DOES NOT FUND CIVIL SOCIETY ORGANIZATIONS OR PROGRAM IMPLEMENTATION.
The WPHF Window for WHRDs Safety Net aims to improve the security and protection of WHRDs by covering costs including, but not limited to:
Temporary relocation costs (e.g. accommodation, food and transportation)
Security and protection costs (e.g. secure transportation, digital or physical security training)
Equipment (e.g. mobile phone, computer, security system and cameras)
Self-care (e.g. physical or mental health support)
Legal assistance
Repatriation costs, to facilitate return and reintegration in home country
Safety Net grants are provided for amounts up to USD 10,000 (subject to revision by the NGO partners of the WPHF Window for WHRDs) to cover needs for a duration of up to six months.
Eligibility Criteria
Gender: women and those who identify as women. This includes lesbian, gay, bisexual, transgender and intersex (LGBTI) human rights defenders.
Age: 18 years old and above.
Country of origin: from/working in conflict- and crisis-affected contexts. *For reference, see the list of countries that might be eligible for support: List of matters of which the UN Security Council is seized (S/2023/10).
Human rights activities: provides details of peaceful engagement in the advancement of human rights, either individually or through a civil society organization.
Threats and risks:
Demonstrates current or past serious security risks for her and/or her dependents, because of her commitment to human rights and peace; AND/OR
Demonstrates that risks are, or have been, such that her ability to keep working on behalf of human rights and peace is threatened.
Grant and duration: the requested funding cannot exceed USD 10,000, or cover needs beyond an anticipated 6-month period.
Decision-making process
You can submit your application using one of the two methods below.
On 28 May 2024 ACCESS NOW published its overview of grants in 2023: Through our Access Now Grants program — now in its ninth year — we provide flexible, grantee-driven financial support to the grassroots and frontline organizations confronting these threats. We do this because we believe the people most directly impacted by attacks on human rights — from Palestine to Myanmar to Ukraine and beyond — are best placed to define solutions and implement them. Below is an overview of our grant-making in 2023, including a deep dive into the humanitarian response to the Gaza crisis, which was sparked that year.
AN OVERVIEW
In 2023, Access Now Grants awarded a total of just under $1.7 million USD, fortifying our collective efforts to defend and extend digital rights. We provided 66 grants to 63 organizations and individuals leading digital rights efforts in about 30 countries. [for details, see: https://www.accessnow.org/digital-rights-grants/]
We strive to support those who need it most. Currently, Access Now Grants reserves nearly all of our funding for people and organizations in Global Majority countries. In 2023, we awarded the highest number of grants (20) in Eastern Europe and Central Asia, followed by Asia Pacific (15), Africa (13), the Middle East and North Africa (10), Latin America and the Caribbean (7), and one grantee that works on the global level but supports journalists and human rights organizations operating in countries experiencing armed conflict and crisis.
Notably, 71% of our 2023 grants supported efforts in countries that Freedom House has classified as “not free” in its Freedom in the World reporting. We also extended funding to organizations and communities we had not previously supported, including in Libya, Iraq, Palestine, Thailand, and Senegal. In addition, 25% of our grants focused on defending gender and sexuality rights and supported people who identify as women, non-binary, or LGBTQ+.
In addition to ensuring we reach people with the funding they need, we work to provide the kind of longer-term support that can help organizations build momentum. In 2023, 60% of the grantees that received core, project, and discretionary grants were in their third or more consecutive year of funding.
SPOTLIGHT ON GAZA: It is impossible to remark on any human rights efforts in 2023 without acknowledging the genocide now unfolding in Palestine. Since the Hamas attack on Israel on October 7, 2023, we have seen digital threats play a devastating role in deepening the crisis in Gaza: from the Israeli military’s reported use of AI technology to bomb and kill Palestinians, often together with their families; to internet shutdowns that restrict Gazans’ access to life-saving information and ability to communicate; to communication platforms’ censorship of Palestinians and pro-Palestinian voices; to the documented increase in hate speech and incitement to violence against Palestinians online…
As we continue our grant-making in the year ahead, in Palestine and around the world, we remain committed to human rights organizations and activists who are fighting for justice, security, and dignity for their communities and for all of us. Their collective work is more necessary and urgent than ever.
ACCESS NOW gives a list of the grants awarded in 2023. Some grants are not included for security reasons. Others must be listed anonymously.
Each year, through its Rise 25 Awards, Mozilla highlights the work of 25 digital leaders using technology to amplify voices, effect change, and build new technologies globally. On 13 May 2024 it was the turn of Raphael Mimoun, a builder dedicated to making tools that empower journalists and human rights defenders. Aron Yohannes talked with Raphael about the launch of his app, Tella, combatting misinformation online, the future of social media platforms and more.
Raphael Mimoun: So I never worked in tech per se and only developed a passion for technology as I was working in human rights. It was really a time when, basically, the power of technology to support movements and to head movements around the world was kind of getting fully understood. You had the Arab Spring, you had Occupy Wall Street, you had all of these movements for social justice, for democracy, for human rights, that were very much kind of spread through technology, right? Technology played a very, very important role. But just after that, it was kind of like a hangover where we all realized, “OK, it’s not just all good and fine.” You also have the flip side, which is government spying on the citizens, identifying citizens through social media, through hacking, and so on and so forth — harassing them, repressing them online, but translating into offline violence, repression, and so on. And so I think that was the moment where I was like, “OK, there is something that needs to be done around technology,” specifically for those people who are on the front lines because if we just treat it as a tool — one of those neutral tools — we end up getting very vulnerable to violence, and it can be from the state, it can also be from online mobs, armed groups, all sort of things.
There’s so much misinformation out there now that it’s so much harder to tell the difference between what’s real and fake news. Twitter was such a reliable tool of information before, but that’s changed. Do you think that any of these other platforms can help make up for so much of the misinformation that is out there?
I think we all feel the weight of that loss of losing Twitter. Twitter was always a large corporation, partially owned by a billionaire. It was never kind of a community tool, but there was still an ethos, right? Like a philosophy, or the values of the platform were still very much like community-oriented, right? It was that place for activists and human rights defenders and journalists and communities in general to voice their opinions. So I think that loss was very hard on all of us.
I see a lot of misinformation on Instagram as well. There is very little moderation there. It’s also all visual, so if you want traction, you’re going to try to put something that is very spectacular that is very eye catchy, and so I think that leads to even more misinformation.
I am pretty optimistic about some of the alternatives that have popped up since Twitter’s downfall. Mastodon actually blew up after Twitter, but it’s much older — I think it’s 10 years old by now. And there’s Bluesky. So I think those two are building up, and they offer spaces that are much more decentralized with much more autonomy and agency to users. You are more likely to be able to customize your feeds. You are more likely to have tools for your own safety online, right? All of those different things that I feel like you could never get on Threads, on Instagram or on Twitter, or anything like that. I’m hoping it’s actually going to be able to recreate the community that is very much what Twitter was. It’s never going to be exactly the same thing, but I’m hoping we will get there. And I think the fact that it is decentralized, open source and with very much a philosophy of agency and autonomy is going to lead us to a place where these social networks can’t actually be taken over by a power hungry billionaire.
What do you think is the biggest challenge that we face in the world this year on and offline, and then how do you think we can combat it?
I don’t know if that’s the biggest challenge, but one of the really big challenges that we’re seeing is how the digital is meeting real life and how people who are active online or on the phone or on the computer are getting repressed for that work in real life. So we developed an app called Tella, which encrypts and hides files on your phone, right? So you take a photo or a video of a demonstration or police violence, or whatever it is, and then if the police tries to catch you and grab your phone to delete it, they won’t be able to find it, or at least it will be much more difficult to find it. Or it would be uploaded already. And things like that, I think is one of the big things that we’re seeing again. I don’t know if that’s the biggest challenge online at the moment, but one of the big things we’re seeing is just that it’s becoming completely normalized to grab someone’s phone or check someone’s computer at the airport, or at the border, in the street and go through it without any form of accountability. People have no idea what the regulations are, what the rules are, what’s allowed, what’s not allowed. And when they abuse those powers, is there any recourse? Most places in the world, at least, where we are working, there is definitely no recourse. And so I think that connection between thinking you’re just taking a photo for social media but actually the repercussion is so real because you’re going to have someone take your phone, and maybe they’re going to delete the photo, or maybe they’re going to detain you. Or maybe they’re going to beat you up — like all of those different things. I think this is one of the big challenges that we’re seeing at the moment, and something that isn’t traditionally thought of as an internet issue or an online digital rights issue because it’s someone taking a physical device and looking through it. It often gets overlooked, and then we don’t have much kind of advocacy around it, or anything like that.
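To make the idea concrete, here is a minimal sketch of the general technique Mimoun describes: encrypting captured media on the device so it is unreadable if the phone is seized. It uses Python and the third-party cryptography package; it is an illustration only, not Tella’s actual implementation, and the file name is hypothetical.

```python
# Illustrative sketch of client-side encryption at rest (NOT Tella's code).
# Assumes: pip install cryptography; "protest_photo.jpg" is a hypothetical file.
from cryptography.fernet import Fernet

def encrypt_file(path: str, key: bytes) -> str:
    """Encrypt the file at `path` and write the ciphertext to `path + '.enc'`."""
    cipher = Fernet(key)
    with open(path, "rb") as src:
        ciphertext = cipher.encrypt(src.read())
    out_path = path + ".enc"
    with open(out_path, "wb") as dst:
        dst.write(ciphertext)
    return out_path

if __name__ == "__main__":
    # In a real app the key would be derived from the user's passcode
    # (and the plaintext original securely deleted), not generated ad hoc.
    key = Fernet.generate_key()
    print("encrypted copy written to", encrypt_file("protest_photo.jpg", key))
```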
What do you think is one action everybody can take to make the world and our lives online a little bit better?
I think social media has a lot of negative consequences for everyone’s mental health and many other things, but for people who are active and who want to be active, consider social networks that are open source, privacy-friendly and decentralized. Bluesky, the Fediverse —including Mastodon — are examples because I think it’s our responsibility to kind of build up a community there, so we can move away from those social media platforms that are owned by either billionaires or massive corporations, who only want to extract value from us and who spy on us and who censor us. And I feel like if everyone committed to being active on those social media platforms — one way of doing that is just having an account, and whatever you post on one, you just post on the other — I feel like that’s one thing that can make a big difference in the long run.
We started Rise25 to celebrate Mozilla’s 25th anniversary. What do you hope that people are celebrating in the next 25 years?
I was talking a little bit earlier about how we are building a culture that is more privacy-centric, like people are becoming aware, becoming wary about all these things happening to their data, their identity, and so on. And I do think we are at a turning point in terms of the technology that’s available to us, the practices and what we need as users to maintain our privacy and our security. I feel like in honestly not even 25, I think in 10 years, if things go well — which it’s hard to know in this field — and if we keep on building what we already are building, I can see how we will have an internet that is a lot more privacy-centric, where communications are private by default. Where end-to-end encryption is ubiquitous in our communication, in our emailing. Where social media isn’t extractive and people have actual ownership and agency in the social networks they use. Where data mining is no longer a thing. I feel like overall, I can see how the infrastructure is now getting built, and that in 10, 15 or 25 years, we will be in a place where we can use the internet without having to constantly watch over our shoulder to see if someone is spying on us or seeing who has access and all of those things.
Lastly, what gives you hope about the future of our world?
That people are not getting complacent and that it is always people who are standing up to fight back. We saw it at Google, with people standing up as part of the No Tech for Apartheid coalition and losing their jobs. We’re seeing it on university campuses around the country. We’re seeing it on the streets. People fight back. That’s where any change has ever come from: the bottom up. I think now, more than ever, people are willing to put something on the line to make sure that they defend their rights. So I think that really gives me hope.
Nikole Yanez is a computer scientist by training, and a human rights defender from Honduras. She is passionate about feminism, the impact of the internet and protecting activists. She was first drawn to human rights through her work as a reporter with a local community radio station. After surviving the coup d’état in Honduras in 2009, Nikole broadened her approach to focus her activism on technology. When she applied for the Digital Forensics Fellowship with the Amnesty Tech Security Lab in 2022, she was looking to learn more about cybersecurity and apply what she learnt with the organizations and collectives she works with regularly.
She highlighted her commitment to fostering a network of tech-savvy communities across Latin America in an interview with Elina Castillo, Amnesty Tech’s Advocacy and Policy Advisor:
I grew up in Honduras, where I lived through the coup d’état, which took place in 2009. It was a difficult time where rights were non-existent, and people were constantly afraid. I thought it was something you only read about in history books, but it was happening in front of my eyes. I felt myself just trying to survive, but as time went by it made me stronger and want to fight for justice. Despite the difficulties, people in my community remained hopeful and we created a community radio station, which broadcast stories about everyday people and their lives with the aim of informing people about their human rights. I was a reporter, developing stories about individual people and their fight for their rights. From there, I found a passion for working with technology and it inspired me to train to become a computer scientist.
I am always looking for ways to connect technology with activism, and specifically to support women and Indigenous people in their struggles. As much as technology presents risks for human rights defenders, it also offers opportunities for us to better protect ourselves and strengthen our movements. Technology can bring more visibility to our movements, and it can empower our work by allowing us to connect with other people and learn new strategies.
Is there one moment where you realized how to connect what you’ve been doing with feminism with technology?
In my work, my perspective as a feminist helps me centre the experiences and needs of marginalised people for trainings and outreach. It is important for me to publicly identify as an Afrofeminist in a society where there is impunity for gendered and racist violence that occurs every day. In Honduras we need to put our energy into supporting these communities whose rights are most violated, and whose stories are invisible.
For example, in 2006, I was working with a Union to install the Ubuntu operating system (an open-source operating system) on their computers. We realized that the unionists didn’t know how to use a computer, so we created a space for digital literacy and learning about how to use a computer at the same time. This became not just a teaching exercise, but an exercise for me to figure out how to connect these tools to what people are interested in. Something clicked for me in this moment, and this experience helped solidify my approach to working on technology and human rights.
There are not many women working in technology and human rights. I don’t want to be one of the only women, so my goal is to see more women colleagues working on technical issues. I want to make it possible for women to work in this field. I also want to motivate more women to create change within the intersection of technology and human rights. Using a feminist perspective and approach, we ask big questions about how we are doing the work, what our approach needs to be, and who we need to work with.
For me, building a feminist internet means building an internet for everyone. This means creating a space where we do not reproduce sexist violence, where we find a community that responds to the people, to the groups, and to the organizations that fight for human rights. This includes involving women and marginalised people in building the infrastructure, in the configuration of servers, and in the development of protocols for how we use all these tools.
In Honduras, there aren’t many people trained in digital forensics analysis, yet there are organizations that are always seeking me out to help check their phones. The fellowship helped me learn about forensic analysis on phones and computers and tied the learning to what I’m actually doing in my area with different organizations and women’s rights defenders. The fellowship was practical and rooted in the experience of civil society organizations.
How do you explain the importance of digital forensics? Well first, it’s incredibly relevant for women’s rights defenders. Everyone wants to know if their phone has been hacked. That’s the first thing they ask: “Can you actually know whether your phone has been hacked?” and “How do I know? Can you do it for me? How?” Those are the things that come up in my trainings and conversations.
I like to help people to think about protection as a process, something ongoing, because we use technology all day long. There are organizations and people that take years to understand that. So, it’s not something that can be achieved in a single conversation. Sometimes a lot of things need to happen, including bad things, before people really take this topic seriously…
I try to use very basic tools when I’m doing digital security support, to say you can do this on whatever device you’re on, this is a prevention tool. It’s not just applying technical knowledge, it’s also a process of explaining, training, showing how this work is not just for hackers or people who know a lot about computers.
One of the challenges is to spread awareness about cybersecurity among Indigenous and grassroots organizations, which aren’t hyper-connected and don’t think that digital forensics work is relevant to them. Sometimes what we do is completely disconnected from their lives, and they ask us: “But what are you doing?” So, our job is to understand their questions and where they are coming from and ground our knowledge-sharing in what people are actually doing.
To someone reading this piece and saying, oh, this kind of resonates with me, where do I start, what would your recommendation be?
If you are a human rights defender, I would recommend that you share your knowledge with your collective. You can teach them the importance of knowing about preventive measures and practicing them, as well as encouraging training to prevent digital attacks, because, in the end, forensic analysis is a reaction to something that has already happened.
We can take a lot of preventive measures to ensure the smallest possible impact. That’s the best way to start. And it’s crucial to stay informed, to keep reading, to stay up to date with the news and build community.
If there are girls or gender non-conforming people reading this who are interested in technical issues, it doesn’t matter if you don’t have a degree or a formal education, as long as you like it. Most hackers I’ve met became hackers because they dive into a subject, they like it and they’re passionate about it.
Last year, the Helpline received a total of 3,709 requests for digital security assistance.
To put this in perspective, the Helpline received 10,000 requests in total between 2014 and 2021, a total that has more than doubled in the three years that followed.
Most (82%) of the cases we dealt with in 2023 were reactive in nature, meaning they related to unfolding incidents or emergencies that required beneficiaries to take rapid measures to strengthen their digital security. The remaining 18% were preventative, whereby beneficiaries preemptively sought out digital security advice, tools, and solutions.
In recent years, the Helpline has been investing in our ability to operate 24/7, 365 days a year, and to deliver more substantial and engaged forms of support. For instance, we’ve been conducting analysis of advanced threats and producing collaborative research in places such as Armenia, Serbia, and Jordan.
In 2023, the regional distribution of cases was as follows:
Our work supports a wide spectrum of civil society stakeholders, from individual activists, human rights defenders, and members of marginalized communities, to journalists and media workers. For it to be as impactful as possible, we work closely with the wider digital security community, through networks like CiviCERT. This allows us to deliver adequate support to each of our beneficiary groups, which were distributed as follows in 2023:
In 2024, the Helpline will continue improving and increasing how we collaborate with activist groups around the world, as well as working to meet the evolving needs of the global rapid response community. If you are a member of civil society in need of digital security assistance, you can find details about how to get in touch on our website.
On 12 March 2024 the U.S. and European Union issued new joint guidance for online platforms to help mitigate virtual attacks targeting human rights defenders, reports Alexandra Kelley, Staff Correspondent, Nextgov/FCW.
Outlined in 10 steps, the guidance was developed following stakeholder consultations from January 2023 to February 2024. Stakeholders including nongovernmental organizations, trade unionists, journalists, lawyers, and environmental and land activists advised both governments on how to protect human rights defenders on the internet.
Recommendations within the guidance include: committing to an HRD [human rights defender] protection policy; identifying risks to HRDs; sharing information with peers and select stakeholders; creating policies to monitor performance against baseline metrics; resourcing staff adequately; building capacity to address local risks; offering education on safety tools; creating an incident reporting channel; providing HRDs with access to help; and incorporating strong, transparent infrastructure.
Digital threats HRDs face include targeted internet shutdowns, censorship, malicious cyber activity, unlawful surveillance, and doxxing. Given the severity and reported increase of digital attacks against HRDs, the guidance calls upon online platforms to take mitigating measures.
“The United States and the European Union encourage online platforms to use these recommendations to determine and implement concrete steps to identify and mitigate risks to HRDs on or through their services or products,” the guidance reads.
The ten guiding points laid out in the document reflect existing transatlantic policy commitments, including the Declaration for the Future of the Internet. Like other digital guidance, however, these actions are voluntary.
“These recommendations may be followed by further actions taken by the United States or the European Union to promote rights-respecting approaches by online platforms to address the needs of HRDs,” the document said.
In preparation for what may be the final days of the trial of Ola Bini, an open source and free software developer arrested shortly after Julian Assange’s ejection from Ecuador’s London Embassy, civil society organizations observing the case have issued a report citing due process violations, technical weaknesses, political pressures, and risks that this criminal prosecution entails for the protection of digital rights. Bini was initially detained three years ago and previous stages of his prosecution had significant delays that were criticized by the Office of the Inter-American Commission on Human Rights (IACHR) Special Rapporteur for Freedom of Expression. An online press conference is scheduled for May 11th, with EFF and other organizations set to speak on the violations in Bini’s prosecution and the danger this case represents. The trial hearing is set for May 16-20, and will most likely conclude next week. If convicted, Bini’s defense can still appeal the decision.
What’s Happened So Far
The first part of the trial against Ola Bini took place in January. In this first stage of testimony and expert evidence, the court repeatedly called attention to various irregularities and violations of due process by the prosecutor in charge. Human rights groups observing the hearing emphasized the flimsy evidence provided against Bini and serious flaws in how the seizure of his devices took place. Bini’s defense stressed that the raid happened without him present, and that seized encrypted devices were examined without following procedural rules and safeguards.
These are not the only problems with the case. Over two years ago, EFF visited Ecuador on a fact-finding mission after Bini’s initial arrest and detention. What we found was a case deeply intertwined with the political effects of its outcome, fraught with due process violations. EFF’s conclusions from our Ecuador mission were that political actors, including the prosecution, have recklessly tied their reputations to a case with controversial or no real evidence.
Ola Bini is known globally as someone who builds secure tools and contributes to free software projects. Bini’s team at ThoughtWorks contributed to Certbot, the EFF-managed tool that has provided strong encryption for millions of websites around the world, and most recently, Bini co-founded a non-profit organization devoted to creating user-friendly security tools.
What Bini is not known for, however, is conducting the kind of security research that could be mistaken for an “assault on the integrity of computer systems,” the crime for which he was initially investigated, or “unauthorized access to a computer system,” the crime of which he is now accused (after prosecutors changed the charges). In 2019, Bini’s lawyers counted 65 violations of due process, and journalists told us at the time that no one was able to provide them with concrete descriptions of what he had done. Bini’s initial imprisonment ended after a decision found his detention illegal, but the investigation continued. The judge was later “separated” from the case in a ruling that admitted the wrongdoing of successive pre-trial suspensions and the violation of due process.
A so-called piece of evidence against Bini was a photo of a screenshot, supposedly taken by Bini himself and sent to a colleague, showing the telnet login screen of a router. The image is consistent with someone who connects to an open telnet service, receives a warning not to log on without authorization, and does not proceed—respecting the warning. As for the portion of a message exchange attributed to Bini and a colleague, leaked with the photo, it shows their concern with the router being insecurely open to telnet access on the wider Internet, with no firewall.
Between the trial hearing in January and its resumption in May, Ecuador’s Prosecutor’s Office revived an investigation against Fabián Hurtado, the technical expert called by Ola Bini’s defense to refute the image of the telnet session and who is expected to testify at the trial hearing.
On January 10, 2022, the Prosecutor’s Office filed charges for procedural fraud against Hurtado. There was a conspicuous gap between this charge and the last investigative proceeding by prosecutors in the case against Hurtado, when police raided his home almost 20 months before, claiming that he had “incorporated misleading information in his résumé”. This raid was violent and irregular, and considered by Amnesty International as an attempt to intimidate Ola Bini’s defense. One of the pieces of evidence against Hurtado is the document by which Bini’s lawyer, Dr. Carlos Soria, included Hurtado’s technical report in Bini’s case file.
Hurtado’s indictment hearing was held on February 9, 2022. The judge opened a 90-day period of investigation which is about to end. As part of this investigation, the prosecutor’s office and the police raided the offices of Ola Bini’s non-profit organization in a new episode of due process violations, according to media reports.
Civil Society Report and Recommendations
Today’s report, by organizations gathered in the Observation Mission of Bini’s case, is important for all those participating in the proceedings and for others concerned about digital rights around the world. There is still time for the court to recognize and correct the irregularities and technical weaknesses in the case. The report highlights key points that should be taken into consideration by the judicial authorities in charge of examining the case.
In particular, the report notes, the accusations have failed to demonstrate a consistent case against Ola Bini. Irregularities in court procedures and police action have affected both the speed of the procedure and due process of law in general. In addition, accusations against Bini show little technical knowledge, and could lead to the criminalization of people carrying out legitimate activities protected by international human rights standards. This case may lead to the further persecution of the so-called “infosec community” in Latin America, which is made up primarily of security activists who find vulnerabilities in computer systems, carrying out work that has a positive impact on society in general. The attempt to criminalize Ola Bini already shows a hostile scenario for these activists and, consequently, for the safeguard of our rights in the digital environment.
Moreover, these activists must be guaranteed the right to use the tools necessary for their work—for example, the importance of online anonymity must be respected as a premise for the exercise of several human rights, such as privacy and freedom of expression. This right is protected by international Human Rights standards, which recognize the use of encryption (including tools such as Tor) as fundamental for the exercise of these rights.
These researchers and activists protect the computer systems on which we all depend, and protect the people who have incorporated electronic devices into their daily lives, such as human rights defenders, journalists and activists, among many other key actors for democratic vitality. Ola Bini, and others who work in the field, must be protected—not persecuted.
On 28 February 2022, EngageMedia posted an anthology of films that highlight Myanmar’s long struggle for democracy.
This movie playlist is from Cinemata, a platform for social and environmental films about the Asia-Pacific. It is a project of EngageMedia, a nonprofit that promotes digital rights, open and secure technology, and social issue documentary. This is edited and republished as part of a content-sharing agreement with Global Voices.
EngageMedia has curated a playlist of films that shows the extent of rights abuses in the country, as well as courageous forms of resistance against the continuing infringement on people’s rights. Marking the one-year anniversary of the coup, “A Year of Resistance” turns the spotlight on the long-standing struggle of the people of Myanmar for democracy.
This film collection is curated in solidarity with the people of Myanmar. In bringing the stories of unrest and atrocities to light, these films hope to inspire action and advocacy towards justice and freedom.
“Burma Rebel Artist: Moe Thandar Aung”
After the Myanmar military coup in February 2021, Moe Thandar Aung, a graphic designer whose work touched on themes of feminism, began making protest art in support of calls to defend and uphold democracy in the country.
“Black out”
In the aftermath of the 2021 Myanmar coup, the country is faced with state-mandated internet and information blackouts. Hnin, a single mother, and Mon, her daughter and an anti-coup protester, are among those who can no longer access the internet at home. In their pursuit of news on what is happening on the ground, they find only fabricated stories and unreliable information.
In the six months following the junta’s coup, at least 950 civilians were violently killed. A total of 90 children under the age of 18 were murdered, while at least 48 children were arrested.
An independent female humanitarian activist from Shan State describes the trauma she experiences in working in an environment pervaded by despair but also her commitment to helping those forced to flee armed conflict. This film was directed by Sai Naw Kham, Mon Mon Thet Khin, and Soe Yu Maw.
In this video, Myanmar activists talk about the digital rights and digital security challenges they face, arguing that freedom of expression, freedom to organize, and freedom to associate should be kept, protected elements of digital rights.
This song was made by 24 young people from six different corners of Myanmar who participated in Turning Tables Myanmar’s yearlong social cohesion project “The Voice of the Youth.” Together they produced and recorded the song “Wake Up,” which calls for democracy, youth participation, and sustainable development to replace corruption and injustice.
This 2009 film shows powerful footage from the Saffron Revolution, a series of economic and political protests led by students and Buddhist monks that swept Myanmar from August to September 2007. It also highlights the continuing need for international solidarity amongst Southeast Asians in times of political upheavals as in the current situation in Myanmar.
Their key point is worth noting: the problem for human rights defenders in the Gulf region and neighbouring countries is that states have exploited the opportunity to align their cybercrime laws with European standards to double down on laws restricting legitimate online expression, BUT without any of the judicial safeguards that exist in Europe.
Several women take part in a protest, using a hashtag, against Saudi Crown Prince Mohamed bin Salman’s visit to the country in Tunis, Tunisia, in November 2018. EFE / Stringer
Governments in every region of the world are criminalizing human rights activism. They do it by prosecuting protest organizers, journalists, internet activists, and leaders of civil society organizations under laws that make it a crime to insult public figures, to disseminate information that damages “public order” or “national security,” or to spread “fake news.”
In the Gulf region and neighbouring countries, oppressive governments have further weaponized their legal arsenal by adopting anti-cybercrime laws that apply these overly broad and ill-defined offline restrictions to online communications.
In an age when online communications are ubiquitous, and in societies where free press is crippled, laws that criminalize the promotion of human rights on social media networks and other online platforms undermine the ability to publicize and discuss human rights violations and threaten the foundation of any human rights movement.
In May of 2018, for example, the Saudi government carried out mass arrests of women advocating online for women’s right to drive. Charged under the country’s cybercrime law, including article six, which prohibits online communication “impinging on public order, religious values, public morals, and privacy,” these human rights activists were detained, tortured, and received multi-year sentences for the “crime” of promoting women’s rights.
There is certainly a necessity to address the prevalence and impact of cybercrimes but without criminalizing people who speak out for human rights.
European countries and the United Nations (UN) have encouraged states to adopt a standard approach to addressing crimes committed with online technologies ranging from wire fraud to financing terrorist groups. The Council of Europe issued a 2001 regional convention on cybercrime, to which any state may accede, and the UN is promoting a cybercrime treaty.
Common standards can prevent the abuse of online technologies by enabling the sharing of online evidence and promoting accountability since the evidence of online crimes often resides on servers outside the country where the harm occurred or where the wrongdoers reside.
The problem for human rights defenders in the Gulf region and neighbouring countries is that states have exploited the opportunity to align their cybercrime laws with European standards to double-down on laws restricting legitimate online expression.
European countries have robust human rights oversight from the European Court of Human Rights, which ensures that limitations on freedom of expression online meet stringent international standards. There is no comparable human rights oversight for the Gulf region. Without adequate international judicial review, governments can successfully exploit international processes to strengthen their ability to stifle online expression.
The regional model cybercrime law, drafted by the United Arab Emirates and adopted by the Arab League in 2004, follows international guidance. However, it incorporates a regional twist and includes provisions that criminalize the online dissemination of content that is “contrary to the public order and morals,” content that facilitates assistance to terrorist groups, and the disclosure of confidential government information related to national security or the economy.
UN experts reviewed the UAE law and gave it a seal of approval, noting it complied with the European convention, ignoring the fact that UN human rights experts have documented repeatedly that governments use such restrictions to crack down on dissent. A UN-sponsored global cybercrime study, published in 2013, similarly soft-pedaled the threat of criminalizing online dissent by noting that governments had leeway to protect local values. Such protection does not extend to speaking up for universal rights like equality and democracy.
Actually, the universal right to freedom of expression protects online content, and limitations must meet international standards of legality, legitimacy, necessity, and proportionality. In our recent report on the use of anti-cybercrime legislation throughout the Gulf region and neighbouring countries, we found that over an 18-month period (May 2018-October 2020), there were 225 credible incidents of online freedom of expression violations against activists and journalists in ten countries: Bahrain, Iran, Iraq, Jordan, Kuwait, Oman, Qatar, Saudi Arabia, Syria, and the UAE. Each country has adopted anti-cybercrime laws except Iraq, where lawmakers’ drafts of proposed legislation have been met with stiff opposition from domestic and international human rights groups.
The international community needs to increase pressure on the Gulf region and neighboring countries to comply with their international obligations to protect freedom of expression off and online. Turning away from the clear evidence that oppressive governments are expanding the reach of criminal law to stifle online human rights activism undermines legitimate international efforts to address cybercrime.
How can we trust the UN to safeguard the voices advocating online for human rights and democracy in a region that so desperately needs both, if it fails to insist human rights safeguards be written into the regional and national cybercrime laws it champions?
In the age of the internet, online human rights activism needs to be supported—and protected—as a vital part of the cybercommunications ecosystem. In the Gulf region, defenders of human rights pay an untenable price for their work, risking arrest, torture, and even death. It is time to reverse the trend while there are still defenders left.
One of the women human rights defenders in Saudi Arabia said before she was imprisoned, “If the repressive authorities here put behind bars every peaceful voice calling for respect for public freedoms and the achievement of social justice in the Gulf region and neighboring countries, only terrorists will remain out.” History has proven the truth of her words, as most of the individuals who led terrorist groups with a global reach have come from this region and have caused, and still cause, chronic problems for the whole world.
The important lesson that we must learn here is that repressive governments foster a destructive dynamic of expansion and intensification of human rights violations. Repressive governments cooperate with and look to one another for strategies and tactics. Further troubling is that what we see in the Gulf region is enabled by the essentially unconditional support provided by some Western governments, especially the US and UK. This toxic template of Western support to governments that oppress their own people constitutes a threat to world peace and prosperity and must be addressed.
On 3 July 2021, Forensic Architecture, supported by Amnesty International and the Citizen Lab, launched a new interactive online platform that maps for the first time the global spread of the notorious spyware Pegasus, made by cyber-surveillance company NSO Group.
‘Digital Violence: How the NSO Group Enables State Terror’ documents digital attacks against human rights defenders around the world, and shows the connections between the ‘digital violence’ of Pegasus spyware and the real-world harms lawyers, activists, and other civil society figures face. “NSO Group is the worst of the worst in selling digital burglary tools to players who they are fully aware actively and aggressively violate the human rights of dissidents, opposition figures, and journalists,” said Edward Snowden, President of Freedom of the Press Foundation.
NSO Group is a major player in the shadowy surveillance industry. The company’s Pegasus spyware has been used in some of the most insidious digital attacks on human rights defenders. When Pegasus is surreptitiously installed on a person’s phone, an attacker has complete access to a phone’s messages, emails, media, microphone, camera, calls and contacts. For my earlier posts on NSO see: https://humanrightsdefenders.blog/tag/nso-group/
“The investigation reveals the extent to which the digital domain we inhabit has become the new frontier of human rights violations, a site of state surveillance and intimidation that enables physical violations in real space,” said Shourideh C. Molavi, Forensic Architecture’s Researcher-in-Charge.
Edward Snowden narrates an accompanying video series which tell the stories of human rights activists and journalists targeted by Pegasus. The interactive platform also includes sound design by composer Brian Eno. A film about the project by award-winning director Laura Poitras will premiere at the 2021 Cannes Film Festival later this month.
The online platform is one of the most comprehensive databases on NSO-related activities, with information about export licenses, alleged purchases, digital infections, and the physical targeting of activists after being targeted with spyware, including intimidation, harassment, and detention. The platform also sheds light on the complex corporate structure of NSO Group, based on new research by Amnesty International and partners.
“For years, NSO Group has shrouded its operations in secrecy and profited from working in the shadows. This platform brings to light the important connections between the use of its spyware and the devastating human rights abuses inflicted upon activists and civil society,” said Danna Ingleton, Deputy Director of Amnesty Tech.
Amnesty International’s Security Lab and Citizen Lab have repeatedly exposed the use of NSO Group’s Pegasus spyware to target hundreds of human rights defenders across the globe. Amnesty International is calling on NSO Group to urgently take steps to ensure that it does not cause or contribute to human rights abuses, and to respond when they do occur. The cyber-surveillance company must carry out adequate human rights due diligence and take steps to ensure that human rights defenders and journalists do not continue to become targets of unlawful surveillance.
In October 2019, Amnesty International revealed that Moroccan academic and activist, Maati Monjib’s phone had been infected with Pegasus spyware. He continues to face harassment by the Moroccan authorities for his human rights work. In December 2020, Maati Monjib was arbitrarily detained before being released on parole on 23 March 2021.
Maati Monjib tells his story in one of the short films and speaks of the personal toll of the surveillance: “The authorities knew everything I said. I was in danger. Surveillance is very harming for the psychological wellbeing of the victim. My life has changed a lot because of all these pressures.”
Amnesty International is calling for all charges against Maati to be dropped, and the harassment against him and his family by the Moroccan authorities to end.