On Human Rights Day, HURIDOCS hosted a webinar to showcase the critical role of documentation and technology in protecting defenders, advocating for the rights of those who are wrongfully detained and supporting those who are detained to claim their rights. This webinar featured four initiatives that recently collaborated with HURIDOCS to safeguard those who champion human rights:
The Observatory for Human Rights Defenders in Chiapas
SOS-Defenders
Detention Landscapes
Papuans Behind Bars
Welcoming remarks: Danna Ingleton, Executive Director, HURIDOCS
Speakers:
Karla Jiménez Montoya, Movilidades Libres y Elegidas (CoLibres)
Giuseppe Scirocco, World Organisation Against Torture (OMCT)
Each year, Mozilla highlights the work of 25 digital leaders who use technology to amplify voices, effect change, and build new technologies globally through its Rise 25 Awards. On 13 May 2024 it was the turn of Raphael Mimoun, a builder dedicated to making tools that empower journalists and human rights defenders. Aron Yohannes talked with Raphael about the launch of his app, Tella, combating misinformation online, the future of social media platforms and more.
Raphael Mimoun: So I never worked in tech per se and only developed a passion for technology as I was working in human rights. It was really a time when, basically, the power of technology to support movements and to head movements around the world was kind of getting fully understood. You had the Arab Spring, you had Occupy Wall Street, you had all of these movements for social justice, for democracy, for human rights, that were very much kind of spread through technology, right? Technology played a very, very important role. But just after that, it was kind of like a hangover where we all realized, “OK, it’s not just all good and fine.” You also have the flip side, which is governments spying on their citizens, identifying citizens through social media, through hacking, and so on and so forth — harassing them, repressing them online, but also translating into offline violence, repression, and so on. And so I think that was the moment where I was like, “OK, there is something that needs to be done around technology,” specifically for those people who are on the front lines, because if we just treat it as a tool — one of those neutral tools — we end up very vulnerable to violence, and it can be from the state, it can also be from online mobs, armed groups, all sorts of things.
There’s so much misinformation out there now that it’s much harder to tell the difference between real and fake news. Twitter was such a reliable tool of information before, but that’s changed. Do you think any of these other platforms can help make up for so much of the misinformation that is out there?
I think we all feel the weight of losing Twitter. Twitter was always a large corporation, partially owned by a billionaire. It was never a community tool, but there was still an ethos, right? Like a philosophy: the values of the platform were still very much community-oriented, right? It was that place for activists and human rights defenders and journalists and communities in general to voice their opinions. So I think that loss was very hard on all of us.
I see a lot of misinformation on Instagram as well. There is very little moderation there. It’s also all visual, so if you want traction, you’re going to post something very spectacular, very eye-catching, and I think that leads to even more misinformation.
I am pretty optimistic about some of the alternatives that have popped up since Twitter’s downfall. Mastodon actually blew up after Twitter, but it’s much older — I think it’s 10 years old by now. And there’s Bluesky. So I think those two are building up, and they offer spaces that are much more decentralized, with much more autonomy and agency for users. You are more likely to be able to customize your feeds. You are more likely to have tools for your own safety online, right? All of those different things that I feel like you could never get on Threads, on Instagram, on Twitter, or anything like that. I’m hoping it’s actually going to be able to recreate the community that Twitter very much was. It’s never going to be exactly the same thing, but I’m hoping we will get there. And I think the fact that it is decentralized, open source and built on a philosophy of agency and autonomy is going to lead us to a place where these social networks can’t be taken over by a power-hungry billionaire.
What do you think is the biggest challenge that we face in the world this year on and offline, and then how do you think we can combat it?
I don’t know if it’s the biggest challenge, but one of the really big challenges we’re seeing is how the digital is meeting real life, and how people who are active online, on the phone or on the computer are getting repressed for that work in real life. So we developed an app called Tella, which encrypts and hides files on your phone. You take a photo or a video of a demonstration or police violence, or whatever it is, and then if the police try to catch you and grab your phone to delete it, they won’t be able to find it, or at least it will be much more difficult to find, or it will already have been uploaded. I don’t know if that’s the biggest challenge online at the moment, but one of the big things we’re seeing is that it’s becoming completely normalized to grab someone’s phone or check someone’s computer at the airport, at the border or in the street, and go through it without any form of accountability. People have no idea what the regulations are, what the rules are, what’s allowed and what’s not. And when those powers are abused, is there any recourse? In most places in the world, at least where we are working, there is definitely no recourse. And so there is that connection between thinking you’re just taking a photo for social media, when actually the repercussions are so real: someone is going to take your phone, and maybe they’re going to delete the photo, or maybe they’re going to detain you, or maybe they’re going to beat you up — all of those different things. I think this is one of the big challenges we’re seeing at the moment, and something that isn’t traditionally thought of as an internet issue or an online digital rights issue, because it’s someone taking a physical device and looking through it. It often gets overlooked, and so we don’t have much advocacy around it.
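To make the encrypt-and-hide idea concrete, here is a minimal Python sketch of encrypting a captured file before it touches ordinary storage, using the `cryptography` library’s Fernet recipe. This is an illustration of the general technique only, not Tella’s actual implementation; the file names and the key handling are simplified assumptions.

```python
# Minimal sketch of the encrypt-before-storage idea behind apps like Tella.
# NOT Tella's actual code: a toy example using the cryptography library's
# Fernet recipe (authenticated symmetric encryption).
from pathlib import Path
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # a real app would derive this from the user's lock code
fernet = Fernet(key)

photo = b"\xff\xd8...stand-in bytes for a captured photo or video..."
ciphertext = fernet.encrypt(photo)

# Store the ciphertext under a nondescript name outside the camera roll,
# so casually scrolling the gallery or file browser shows nothing readable.
vault = Path("vault")
vault.mkdir(exist_ok=True)
(vault / ".cache0001").write_bytes(ciphertext)

# The owner (or a trusted server the file was already uploaded to) can decrypt later.
assert fernet.decrypt(ciphertext) == photo
```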
What do you think is one action everybody can take to make the world and our lives online a little bit better?
I think social media has a lot of negative consequences for everyone’s mental health and many other things, but for people who are active and who want to be active, consider social networks that are open source, privacy-friendly and decentralized. Bluesky and the Fediverse, including Mastodon, are examples. I think it’s our responsibility to build up a community there, so we can move away from those social media platforms that are owned by billionaires or massive corporations, who only want to extract value from us, who spy on us and who censor us. And I feel like if everyone committed to being active on those platforms — one way of doing that is just having an account, and whatever you post on one, you just post on the other — that’s one thing that can make a big difference in the long run.
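The cross-posting habit he describes is easy to automate on the open platforms he names. As a hedged sketch, the snippet below posts one status to several Mastodon-compatible servers through Mastodon’s public REST API (POST /api/v1/statuses); the instance URLs and access tokens are placeholders, and Bluesky, which uses the different AT Protocol, is omitted for brevity.

```python
# Sketch of cross-posting one message to several Mastodon-compatible servers,
# in the spirit of "whatever you post on one, you just post on the other".
# Instance URLs and tokens below are placeholders.
import requests

ACCOUNTS = [
    {"instance": "https://mastodon.example", "token": "YOUR-TOKEN-1"},
    {"instance": "https://another.example",  "token": "YOUR-TOKEN-2"},
]

def cross_post(text: str) -> None:
    for acct in ACCOUNTS:
        # Mastodon's endpoint for creating a status, authenticated
        # with an OAuth Bearer token tied to your account.
        resp = requests.post(
            f"{acct['instance']}/api/v1/statuses",
            headers={"Authorization": f"Bearer {acct['token']}"},
            data={"status": text},
            timeout=10,
        )
        resp.raise_for_status()
        print(f"posted to {acct['instance']}: {resp.json().get('url')}")

cross_post("Hello, Fediverse!")
```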
We started Rise25 to celebrate Mozilla’s 25th anniversary. What do you hope that people are celebrating in the next 25 years?
I was talking a little earlier about how we are building a culture that is more privacy-centric: people are becoming aware, becoming wary, about what happens to their data, their identity, and so on. And I do think we are at a turning point in terms of the technology available to us, the practices, and what we need as users to maintain our privacy and security. Honestly, not even in 25 years — I think in 10 years, if things go well, which is hard to know in this field, and if we keep building what we are already building, I can see how we will have an internet that is a lot more privacy-centric, where communications are private by default. Where end-to-end encryption is ubiquitous in our communication, in our emailing. Where social media isn’t extractive and people have actual ownership and agency in the social networks they use. Where data mining is no longer a thing. I feel like, overall, I can see how the infrastructure is now getting built, and that in 10, 15 or 25 years, we will be in a place where we can use the internet without having to constantly watch over our shoulder to see if someone is spying on us, or who has access, and all of those things.
Lastly, what gives you hope about the future of our world?
That people are not getting complacent, and that it is always people who are standing up to fight back. We saw it at Google, with people standing up as part of the No Tech for Apartheid coalition and losing their jobs. We’re seeing it on university campuses around the country. We’re seeing it on the streets. People fight back. That’s where any change has ever come from: the bottom up. I think now, more than ever, people are willing to put something on the line to make sure they defend their rights. So that really gives me hope.
Nikole Yanez is a computer scientist by training, and a human rights defender from Honduras. She is passionate about feminism, the impact of the internet and protecting activists. She was first drawn to human rights through her work as a reporter with a local community radio station. After surviving the coup d’état in Honduras in 2009, Nikole broadened her approach to focus her activism on technology. When she applied for the Digital Forensics Fellowship with the Amnesty Tech Security Lab in 2022, she was looking to learn more about cybersecurity and apply what she learnt with the organizations and collectives she works with regularly.
She highlighted her commitment to fostering a network of tech-savvy communities across Latin America in an interview with Elina Castillo, Amnesty Tech’s Advocacy and Policy Advisor:
I grew up in Honduras, where I lived through the coup d’état, which took place in 2009. It was a difficult time where rights were non-existent, and people were constantly afraid. I thought it was something you only read about in history books, but it was happening in front of my eyes. I felt myself just trying to survive, but as time went by it made me stronger and want to fight for justice. Despite the difficulties, people in my community remained hopeful and we created a community radio station, which broadcast stories about everyday people and their lives with the aim of informing people about their human rights. I was a reporter, developing stories about individual people and their fight for their rights. From there, I found a passion for working with technology and it inspired me to train to become a computer scientist.
I am always looking for ways to connect technology with activism, and specifically to support women and Indigenous people in their struggles. As much as technology presents risks for human rights defenders, it also offers opportunities for us to better protect ourselves and strengthen our movements. Technology can bring more visibility to our movements, and it can empower our work by allowing us to connect with other people and learn new strategies.
Is there one moment where you realized how to connect what you’ve been doing with feminism with technology?
In my work, my perspective as a feminist helps me centre the experiences and needs of marginalised people for trainings and outreach. It is important for me to publicly identify as an Afrofeminist in a society where there is impunity for gendered and racist violence that occurs every day. In Honduras we need to put our energy into supporting these communities whose rights are most violated, and whose stories are invisible.
For example, in 2006, I was working with a union to install Ubuntu (an open-source operating system) on their computers. We realized that the unionists didn’t know how to use a computer, so we created a space for digital literacy and learning how to use a computer at the same time. This became not just a teaching exercise, but an exercise for me in figuring out how to connect these tools to what people are interested in. Something clicked for me in that moment, and the experience helped solidify my approach to working on technology and human rights.
There are not many women working in technology and human rights. I don’t want to be one of the only women, so my goal is to see more women colleagues working on technical issues. I want to make it possible for women to work in this field. I also want to motivate more women to create change within the intersection of technology and human rights. Using a feminist perspective and approach, we ask big questions about how we are doing the work, what our approach needs to be, and who we need to work with.
For me, building a feminist internet means building an internet for everyone. This means creating a space where we do not reproduce sexist violence, where we find a community that responds to the people, to the groups, and to the organizations that fight for human rights. This includes involving women and marginalised people in building the infrastructure, in the configuration of servers, and in the development of protocols for how we use all these tools.
In Honduras, there aren’t many people trained in digital forensics analysis, yet there are organizations that are always seeking me out to help check their phones. The fellowship helped me learn about forensic analysis on phones and computers and tied the learning to what I’m actually doing in my area with different organizations and women’s rights defenders. The fellowship was practical and rooted in the experience of civil society organizations.
How do you explain the importance of digital forensics?
Well, first, it’s incredibly relevant for women’s rights defenders. Everyone wants to know if their phone has been hacked. That’s the first thing they ask: “Can you actually know whether your phone has been hacked?” and “How do I know? Can you do it for me? How?” Those are the things that come up in my trainings and conversations.
I like to help people to think about protection as a process, something ongoing, because we use technology all day long. There are organizations and people that take years to understand that. So, it’s not something that can be achieved in a single conversation. Sometimes a lot of things need to happen, including bad things, before people really take this topic seriously…
I try to use very basic tools when I’m doing digital security support, to say you can do this on whatever device you’re on, this is a prevention tool. It’s not just applying technical knowledge, it’s also a process of explaining, training, showing how this work is not just for hackers or people who know a lot about computers.
One of the challenges is to spread awareness about cybersecurity among Indigenous and grassroots organizations, which aren’t hyper-connected and don’t think that digital forensics work is relevant to them. Sometimes what we do is completely disconnected from their lives, and they ask us: “But what are you doing?” So, our job is to understand their questions and where they are coming from and ground our knowledge-sharing in what people are actually doing.
To someone reading this piece and saying, oh, this kind of resonates with me, where do I start, what would your recommendation be?
If you are a human rights defender, I would recommend that you share your knowledge with your collective. You can teach them the importance of knowing about preventive security measures and practicing them, as well as encouraging training to prevent digital attacks, because, in the end, forensic analysis is a reaction to something that has already happened.
We can take a lot of preventive measures to ensure the smallest possible impact. That’s the best way to start. And it’s crucial to stay informed, to keep reading, to stay up to date with the news and build community.
If there are girls or gender non-conforming people reading this who are interested in technical issues, it doesn’t matter if you don’t have a degree or a formal education, as long as you like it. Most hackers I’ve met became hackers because they dove into a subject, liked it and grew passionate about it.
On 22 January 2024, Amnesty International published an interesting piece by Alex, a 31-year-old Romanian activist working at the intersection of human rights, technology and public policy.
Seeking to use her experience and knowledge of tech for political change, Alex applied and was accepted onto the Digital Forensics Fellowship led by the Security Lab at Amnesty Tech. The Digital Forensics Fellowship (DFF) is an opportunity for human rights defenders (HRDs) working at the nexus of human rights and technology to expand their learning.
Here, Alex shares her activism journey and insight into how like-minded human rights defenders can join the fight against spyware:
In the summer of 2022, I watched a recording of Claudio Guarnieri, former Head of the Amnesty Tech Security Lab, presenting about Security Without Borders at the 2016 Chaos Communication Congress. After following the investigations of the Pegasus Project and other projects centring on spyware being used on journalists and human rights defenders, his call to action at the end — “Find a cause and assist others” — resonated with me long after I watched the talk.
Becoming a tech activist
A few days later, Amnesty Tech announced the launch of the Digital Forensics Fellowship (DFF). It was serendipity, and I didn’t question it. At that point, I had already pushed myself to seek out a more political, more involved way to share my knowledge. Not tech for the sake of tech, but tech activism to ensure political change.
I followed an atypical path for a technologist. Prior to university, I dreamt of being a published fiction author, only to switch to studying industrial automation in college. I spent five years as a developer in the IT industry and two as Chief Technology Officer for an NGO, where I finally found myself using my tech knowledge to support journalists and activists.
My approach to technology, like my approach to art, is informed by political struggles, as well as the questioning of how one can lead a good life. My advocacy for digital rights follows this thread. For me, technology is merely one of many tools at the disposal of humanity, and it should never be a barrier to decent living, nor an oppressive tool for anyone.
The opportunity offered by the DFF matched my interests and the direction I wanted to take my activism. During the year-long training programme from 2022-2023, the things I learned turned out to be valuable for my advocacy work.
In 2022, the Child Sexual Abuse Regulation was proposed in the EU. I focused on conducting advocacy to make it as clear as possible that losing encrypted communication would make life decidedly worse for everyone in the EU. We ran a campaign to raise awareness of the importance of end-to-end encryption for journalists, activists and people in general. Our communication unfolded under the banner of “you don’t realize how precious encryption is until you’ve lost it”. Apti.ro, the Romanian non-profit organisation that I work with, also participated in the EU-wide campaign, as part of the EDRi coalition. To add fuel to the fire, spyware scandals erupted across the EU. My home country, Romania, borders countries where spyware has been proven to have been used to invade the personal lives of journalists, political opponents of the government and human rights defenders.
The meaning of being a Fellow
The Security Lab provided us with theoretical and practical sessions on digital forensics, while the cohort was a safe, vibrant space to discuss challenges we were facing. We debugged together and discussed awful surveillance technology at length, contributing our own local perspective.
The importance of building cross-border networks of cooperation and solidarity became clear to me during the DFF. I heard stories of struggles from people involved in large and small organizations alike. I am convinced our struggles are intertwined, and we should join forces whenever possible.
Now when I’m working with other activists, I try not to talk of “forensics”. Instead, I talk about keeping ourselves safe, and our conversations private. Often, discussions we have as activists are about caring for a particular part of our lives – our safety when protesting, our confidentiality when organizing, our privacy when convening online. Our devices and data are part of this process, as is our physical body. At the end of the day, digital forensics are just another form of caring for ourselves.
I try to shape discussions about people’s devices similarly to how doctors discuss the symptoms of an illness. The person whose device is at the centre of the discussion is the best judge of the symptoms, and it’s important to never minimize their apprehension. It’s also important to go through the steps of the forensics in a way that allows them to understand what is happening and what the purpose of the procedure is.
I never use a one-size-fits-all approach because the situation of the person who owns a device informs the ways it might be targeted or infected.
The human approach to technology
My work is human-centred and technology-focused, and it requires care and concentration to achieve meaningful results. If you are an activist interested in working on digital forensics, start by digging deep into the threats you see in your local context. If numerous phishing campaigns are unfolding, dig into network forensics and map out the owners of the domains and the infrastructure (a minimal sketch of this kind of triage follows these steps).
Secondly, get to know the person you are working with. If they are interested in secure communications, help them gain a better understanding of mobile network-based attacks, as well as suggesting instant messaging apps that preserve the privacy and the security of their users. In time, they will be able to spot “empty words” used to market messaging apps that are not end-to-end encrypted.
Finally, to stay true to the part of me that loves a well-told story, read not only reports of ongoing spyware campaigns, but narrative explorations from people involved. “Pegasus: The Story of the World’s Most Dangerous Spyware” by Laurent Richard and Sandrine Rigaud is a good example that documents both the human and the technical aspects. The Shoot the Messenger podcast, by PRX and Exile Content Studio, is also great as it focuses on Pegasus, starting from the brutal murder of Jamal Khashoggi to the recent infection of the device of journalist and founder of Meduza, Galina Timchenko.
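To ground the first of these steps, here is the promised sketch: a hypothetical Python script that resolves suspected phishing domains and pulls their WHOIS records to begin mapping owners and infrastructure. The domain names are placeholders, and the script assumes the standard `whois` command-line utility is installed.

```python
# Hypothetical sketch of basic network-forensics triage: given domains seen in
# phishing campaigns, resolve their IPs and fetch WHOIS records. Registrar,
# creation date and name servers often link separate campaigns together.
import socket
import subprocess

SUSPECT_DOMAINS = ["phish-example-1.example", "phish-example-2.example"]

for domain in SUSPECT_DOMAINS:
    try:
        ip = socket.gethostbyname(domain)   # basic DNS resolution
    except socket.gaierror:
        print(f"{domain}: does not resolve (parked or taken down?)")
        continue
    # Shell out to the standard `whois` utility for registration data.
    record = subprocess.run(["whois", domain],
                            capture_output=True, text=True).stdout
    interesting = [
        line.strip() for line in record.splitlines()
        if line.strip().lower().startswith(
            ("registrar:", "creation date:", "name server:"))
    ]
    print(f"{domain} -> {ip}")
    print("\n".join(interesting) or "  (no standard WHOIS fields found)")
```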
We must continue to do this research, however difficult it may be, and to tell the stories of those impacted by these invasive espionage tactics. Without this work we wouldn’t be making the political progress we’ve seen to stem the development and use of this atrocious technology.
Danna joins HURIDOCS from the Amnesty Tech management team, where she played an integral role in growing globally distributed teams, securing and managing large grants, and providing strategic and operational leadership. She combines perceptive and empathetic leadership with a bright, organised, fearless approach to building organisational strength and resilience. See: https://humanrightsdefenders.blog/tag/danna-ingleton/
“At a time when the power of accurate, accessible and secure information has never been more important to those seeking justice and the fulfilment of their human rights, I am thrilled to be starting as the new Executive Director of HURIDOCS.”
It is exciting to be joining an organisation with such a rich history of harnessing the power of information to facilitate change. Together with my new colleagues and our diverse, valiant partners we will build on this history to ensure HURIDOCS is consistently at the sharp edge of information management and technological developments, and always strategically growing.
As an activist myself who has been working in human rights for more than a decade I have seen how the battle for justice can take its toll on the people behind the movements. I am therefore also committed to ensuring HURIDOCS is an effective and accountable workplace that values health and the well-being of us all. – Danna Ingleton
Danna will officially assume her responsibilities on 1 July 2023
Olga Solovyeva of Advox, a Global Voices project dedicated to protecting freedom of expression online, posted a piece on 19 April 2023 stating that the impact of technology on politics can no longer be ignored. It is a long piece that I copy in its totality, as it is worth reading and of great relevance for human rights defenders:
Amidst the rising influence of technology in global politics, particularly in authoritarian regimes, the imperative to acknowledge the political accountability of tech corporations has become increasingly apparent. In recent years, the ramifications of disregarding ethical practices underscore the urgent need for tech companies to prioritize responsible conduct. The manipulation of information online, traffic rerouting, restricting access to the internet, and operating surveillance are some examples of how states can misuse technology. While technology was once expected to become a symbol of resistance and liberation, illiberal regimes now use it to produce various forms of digital unfreedom that extend into material reality. But how do we ensure that Big Tech contributes to democratic practices rather than political oppression?
Why do tech companies have political responsibility?
In an innovation-driven sector like technology, legislation cannot keep pace with new developments. Often, neither users nor makers consider the negative consequences of a new technology until they have experienced them, and the industry is left struggling with the ramifications of harm and, as a consequence, its own expanding responsibilities.
Digital activists from Global Voices Advox report on the growing use of digital technology for advancing authoritarian regimes worldwide, focusing, among others, on issues such as surveillance, mis/disinformation and access to the internet in different contexts. Autocrats use the whole scale of digital technologies available. In Russia, where the interest of the state lies in keeping opposition views from the information environment, there is a strong emphasis on disinformation and censorship. Tanzania and Sudan are known for internet shutdowns, while in Turkey and Morocco, cases of public digital surveillance have become more common.
At the same time, the tech sector does not necessarily play only on the dark side. Elon Musk’s SpaceX continued to support Starlink and provide internet access in Ukraine after the Russian invasion disrupted services. And yet his purchase of Twitter brought multiple controversies, further empowering the attention economy of social media, which leads to fragmentation, polarisation and the decline of the public sphere. It’s impossible to separate tech companies from politics, and their role tends to cause controversy.
Good apple, bad apple
If you’re reading this text on your MacBook or iPhone, you have probably noticed the difference: you are living in a new information space with much less targeted advertising. In February 2022, Apple introduced its new privacy features allowing users to enable or block personal data tracking by the apps installed on the company’s devices, an innovation with significant political, social and economic consequences.
It’s crucial to understand the business decision that underpins the ongoing debate on personal data ethics and regulation. Protecting Apple users’ personal data means they will not be targeted with personally crafted advertising, and their data will not be used to predict consumer behaviour, which enables users’ right to privacy — one of the central categories of online service providers’ moral responsibilities and, essentially, a human right. This guarantee of the right attracts consumers to Apple products.
At the same time, this architectural decision caused significant distress to the market, as the stock prices of Meta and other social media companies plunged that day. Introducing an opt-out particularly for personal data collection means shrinking their potential advertising revenues as less data becomes available to develop personalized ads.
Apple made a policy-level decision, a milestone in the discussion on issues of user privacy regulation. Effectively, it is a subject of government concern at the intersection of information and business ethics, law and policy. This case illustrates the power of one company, which can be not just a game changer in the conversation on tech regulation but a shock for the industry, pushing other businesses to shift their business models and challenging the dynamics of Big Tech.
What is this decision for Apple? An enactment of an ethical stand signalling its political responsibility? An act of an excellent corporate citizen innovating to enable its customers’ rights for privacy? Or is it a marketing move to boost the sale of Apple products through engaging in a non-market activity? Regardless of the motivation, we have witnessed a tech company making a political change on an international level, since Apple products are in demand and sold worldwide.
At the same time, the company engages in other activities that may be seen as controversial. Along with other Big Tech companies, Apple increased its lobbying spending in 2022 as businesses face increased pressure from lawmakers raising antitrust concerns to curb the power of tech giants. Meanwhile, stepping outside the liberal democratic political climate, Apple faces decisions that challenge its political stand. In 2021 the company confirmed storing all personal data of Chinese users inside China-based data centres. China is known for using surveillance as a tool for political persecution. Even though Apple claimed to maintain a high level of security, journalist sources report that the company handed over the keys to the government. The same year, Apple removed the Smart Voting app, one of the tools developed by the opposition in Russia to outplay electoral fraud. In both cases, the company’s decision-making had severe and direct political consequences, just like the decision to block personal data tracking on its devices. The only difference was the kind of pressure put on the company by the political system it was operating in.
Where does the political responsibility of Big Tech end?
In 2022 the world saw the global expansion of authoritarian rule, affecting developing states and established democracies. According to the 2022 Freedom House report, only 20 percent of the earth’s population live in a free country, while the remaining 80 percent are equally split between a partially free and not free world. The world is getting more authoritarian, and the political regime of a liberal democracy today is the exception rather than the rule.
Different autocracies pose challenging obstacles to tech companies, which remain the key producers of innovative technology. The role of the state defines the potential expectations of business, and their relationship patterns. In autocracies, political participation and public deliberation face repression through state authorities, and business is shaped by a political economy with the elements of state intervention. The state prevails, and it has more direct control over the company when needed, and the interference in economic life is ordinary and unpredictable. Autocrats are famous for censorship, propaganda, and interventions in electoral systems, all of which are delivered by technology provided by business.
One of the most common examples is a situation in which a business organization has to obey the law of an authoritarian state to maintain political legitimacy, while the law itself may undermine the moral legitimacy of the company. The case of Apple in China is an example of this. However, it can have different consequences for companies in other countries. For instance, Yahoo! (whose operating business was bought by Verizon in 2017) was sued for handing data to the Chinese government that led to political prosecution and the torture of dissidents. In authoritarian regimes, legislation is often designed to set out the specific requirements and processes for government agencies to obtain access to personal data, including for surveillance purposes. Even though data handovers upon request, e.g. under subpoena, are common in democratic regimes as well, the difference is how such data is further used and whether there are grounds for balancing it out with other institutional procedures.
Elaborating on the political responsibility of Big Tech
As the intersection of technology and politics continues to expand, grappling with the political implications of new creations becomes imperative for tech innovators. They must take proactive steps to develop robust political responsibility strategies while navigating authoritarian and other ethically fraught environments. Transparency is one way to meet these goals.
The practice of environmental, social and governance (ESG) reporting and disclosure on ESG issues is an excellent example of how mandated transparency has led to accountability, and one that can be adapted to technological innovation. Openly revealing who has bought a certain technology would, for example, limit the ability of authoritarian governments to abuse it. Additionally, integrating political responsibility into responsible investment portfolios could represent a meaningful step towards starting an open dialogue about tech, politics and society. This could be done by disclosing companies’ direct political engagement and adding transparency about the contexts in which they operate.
Yet such openness would be even more problematic — and potentially impossible — for tech companies that have developed within the borders, and hence the jurisdiction, of authoritarian regimes. One of the most illustrative examples is the case of Yandex, a multinational company headquartered in Russia. The company has grown into a major tech player, often referred to as the “Russian Google.” Despite making occasional compromises with the political system, the company kept its reputation as the most liberal company in the country while showing steady business growth.
However, when Russia invaded Ukraine in February 2022, Yandex faced significant pressure, legislative restrictions, international sanctions and criticism from the public. From the first weeks of the war, YandexNews, visited daily by 40 million people, has been indexing only stories from state-owned media, amplifying the narratives of the “special operation.” Abiding by the law became equivalent to contributing to univocal media coverage dominated by the Russian state.
The war became the most significant trigger that affected the company, as the share price of this prominent business lost over 75 percent of its value. Many company employees, including top management, resigned or left the country in protest of the war led by Russia. Personal sanctions were applied to the company’s CEO and founder. Under pressure, the company sold their media assets to a holding loyal to the state. In December, the company’s founder left Yandex Russia but remained the key shareholder.
Scenarios like these establish controversial ground for businesses that must come to terms with an authoritarian state’s rules to keep their business going. Albert Hirschman’s “Exit, Voice, and Loyalty: Responses to Decline in Firms, Organizations, and States” suggests a framework of three strategies for responding to a perceived decline in the performance of an organization or a state. Using it as a guide to organizational strategy, a tech company facing authoritarianism could leave, protest or comply. However, as the suppression of public dissent usually characterizes authoritarianism, realistically only two strategies are left: to stay or to go.
Nevertheless, both strategies bring further ethical concerns. With a lot said about the downsides of collaborating with autocrats, how ethical is it towards the employees and customers for a business to leave the declining state? Moreover, the business remains a profit-generating enterprise first of all, and very few countries in the world would make a market for a product so the company’s leadership could keep to the standard of political responsibility. We can’t all live in Norway, after all.
As the influence of tech companies continues to grow, it falls to civil society, journalists, tech users, and watchdog organisations to keep these firms accountable. Demanding transparency and collaborating to come up with new fair policies that could support tech companies in tough contexts could be one way forward. Meanwhile, it is important to educate the public and create incentives for consuming tech other than instant gratification. By working together, these stakeholders can start shaping a more ethical tech landscape, where common good carries more weight than corporate interest.
Towards Life 3.0: Ethics and Technology in the 21st Century is a talk series organized and facilitated by Dr. Mathias Risse, Director of the Carr Center for Human Rights Policy, and Berthold Beitz Professor in Human Rights, Global Affairs, and Philosophy. Drawing inspiration from the title of Max Tegmark’s book, Life 3.0: Being Human in the Age of Artificial Intelligence, the series draws upon a range of scholars, technology leaders, and public interest technologists to address the ethical aspects of the long-term impact of artificial intelligence on society and human life.
On 20 April you can join a 45-minute session with WITNESS’ new Executive Director Sam Gregory [see: https://humanrightsdefenders.blog/2023/04/05/sam-gregory-finally-in-the-lead-at-witness/] on how AI is changing the media and information landscape; the creative opportunities for activists and threats to truth created by synthetic image, video, and audio; and the people and places being impacted but left out of the current conversation.
Sam says “Don’t let the hype-cycle around ChatGPT and Midjourney pull you into panic, WITNESS has been preparing for this moment for the past decade with foundational research and global advocacy on synthetic and manipulated media. Through structured work with human rights defenders, journalists, and technologists on four continents, we’ve identified the most pressing concerns posed by these emerging technologies and concrete recommendations on what we must do now.
“We have been listening to critical voices around the globe to anticipate and design thoughtful responses to the impact of deepfakes and generative AI on our ability to discern the truth. WITNESS has proactively worked on responsible practices for synthetic media as a part of the Partnership on AI and helped develop technical standards to understand media origins and edits with the C2PA. We have directly influenced standards for authenticity infrastructure and continue to forcefully advocate for centering equity and human rights concerns in the development of detection technologies. We are convening with the people in our communities who have most to gain and lose from these technologies to hear what they want and need, most recently in Kenya at the #GenAIAfrica convening.”
Jon Stone in the Independent of 13 July 2020 wrote about the UK Government being urged to explain £75m of exports to countries rated ‘not free’. The British government is providing more than a dozen repressive regimes around the world with wiretaps, spyware and other telecommunications interception equipment they could use to spy on dissidents, public records show. Despite rules saying the UK should not export security goods to countries that might use them for internal repression, ministers have signed off more than £75m in such exports over the past five years to states rated “not free” by the NGO Freedom House.
The 17 countries include China, Saudi Arabia and Bahrain, as well as the United Arab Emirates, which was the biggest recipient, with licences totalling £11.5m alone since 2015… One such beneficiary of the UK’s exports is Hong Kong, which had a £2m shipment approved last year despite ongoing repression of pro-democracy protests. The Philippines, where police extrajudicial killings are rampant, has also provided steady business for British firms hawking surveillance systems…
A government spokesperson said blandly: “The government takes its export responsibilities seriously and assesses all export licences in accordance with strict licensing criteria. We will not issue any export licences where to do so would be inconsistent with these criteria.” But Oliver Feeley-Sprague, Amnesty International UK’s programme director for military, security and police affairs, said the UK did not seem to be undertaking proper risk assessments when selling such equipment and said the government’s controls were becoming “notorious” for their “faulty decision-making”…
“With numerous human rights defenders arrested and jailed in countries like Saudi Arabia, the UAE and Turkey in the past five years, there’s a greater need than ever for the UK to be absolutely scrupulous in assessing the risk of UK telecoms technology being used unlawfully against human rights activists, journalists, and peaceful opposition figures.
“It’s just not clear that the UK is undertaking proper risk assessments when selling this equipment, and it’s not clear whether UK officials are making any effort to track how the equipment is used in one, two or three years’ time.”
This week, International Trade Secretary Liz Truss announced the UK would be resuming arms exports to Saudi Arabia, after a court had previously ordered their suspension. The government said it had reviewed claims that Saudi forces in Yemen had breached international humanitarian law and concluded that any possible breaches were “isolated incidents” because they had happened in different places and in different ways.
Andrew Smith of Campaign Against Arms Trade said the sale of the spying equipment raised “serious questions and concerns”.
As the Universal Declaration of Human Rights turns 70 – is it time for a new approach? asks Barbara von Ow-Freytag, journalist, political scientist and adviser at the Prague Civil Society Centre, in the World Economic Forum. This piece is certainly worth reading as a whole. It is close to my heart in that it stresses the need to take a hard look at how young human rights defenders focus their energy where they can achieve real, concrete change within their own communities. Their campaigns are grassroots-led and use local languages and issues their communities understand. They often use technology and creative formats, with a heavy dose of visual and artistic elements. Where the international scene seems to stagnate and even backpedal, better use of communication skills and tools (such as images) is certainly part of the answer:
As the Universal Declaration of Human Rights turns 70, a new generation of human rights defenders are reinventing themselves to fight for old rights amid a new world order. Based not on declarations, charters and international bodies, but on the values which underpin them – justice, fairness, equality – they shun the language of their predecessors while embracing the same struggle…However, in the new realities of the 21st century, the mechanisms to promote human rights that grew out of the Universal Declaration are showing their age. Authoritarianism is on the rise across the world, with popular leaders cracking down on human rights defenders.
Freedom House found 2018 was the 12th consecutive year that the world became less free. Civicus, which specifically monitors the conditions for civil society activists and human rights defenders, found civil society was “under attack” in more countries than it wasn’t, with all post-Soviet countries (except Georgia) ranging between “obstructed” and “closed”.
Troublingly, both the willingness and the ability of Western bastions of human rights are also on the wane. Inside the EU, talk of illiberal democracy gains traction, and internal crises divert attention away from the global stage. Perhaps unsurprisingly, throughout Eastern Europe and the former Soviet Union, younger activists and civil society are giving up on western governments and international organizations to advocate on their behalf. Pavel Chikov, director of the Agora group, said recently that, “Russian human rights groups no longer have a role model,” calling the liberal human rights agenda “obsolete”.
Growing disillusionment has led many rights groups to shift away from appealing to outsiders for support. Younger campaigners no longer frame their work in the traditional language of human rights, and many do not even consider themselves human rights defenders. Instead of referring to international agreements violated, they focus on solving practical problems, or creating their own opportunities to advance values of equality, justice and fairness.
Formats too have changed. Throughout the region, tools used by civil society to raise social consciousness are becoming diverse, dynamic and smart. Instead of one-person legal tour de forces, genuinely grassroots, tech-powered, peer-to-peer or horizontal networks are proving effective. Media, music, art, film, innovative street protests, urbanism and online initiatives focused on local communities are coming to replace petitions and international advocacy.
Team 29, an association of Russian human rights lawyers and journalists, is among the most successful of this new generation. It has repositioned itself as part-legal aid provider, part-media outlet. Its website offers a new mix of news on ongoing trials, animated online handbooks for protesters, videos on torture and a new interactive game telling young people how to behave if they are detained by police.
What may look like PR-friendly add-ons are actually core to their operation. Anastasia Andreeva, the team’s media expert, says: “Before, we consulted some 30 clients, now we reach tens of thousands of people.”
Azerbaijani activist Emin Milli also embodies this journey of wider civil society – turning away from the international towards local solutions. In the early 2000s, he was a traditional human rights defender, successfully using international mechanisms, such as the Council of Europe, to assist political prisoners. He went on to found Meydan TV, an independent Azerbaijani media outlet.
The key to Meydan’s success is its accessibility. Milli says: “We do stories about ordinary people. Real Azeris who have everyday problems.” Through its smart coverage, investigating and highlighting how injustice affects these ordinary people, and not referring to UN-enshrined rights and responsibilities, Meydan is “giving a voice to people who fight for women’s rights, people who fight for political rights, for civil liberties, and everybody who feels they are voiceless”.
Music, too, is increasingly being used as a vehicle to realize human rights. Though he might shun the label, Azeri rapper Jamal Ali is perhaps one of the country’s most well-known “human rights defenders”. His songs about injustice and corruption regularly go viral, raising national and international awareness in the same way a statement at the UN General Assembly might have done three decades ago.
In a 2017 hit, he highlighted how two young men had been tortured by police and faced 10 years in prison for spraying graffiti on a statue of former president Heydar Aliyev. In response, the regime arrested Ali’s mother, demanding that he remove the video from YouTube, only to ensure that Ali’s song went even more viral among Azeri youngsters.
Gender equality and women’s rights are also being advanced by unexpected new champions. In Kyrgyzstan, 20-year-old singer Zere Asylbek sparked a feminist shockwave earlier this year with her video Kyz (“Girl”). “Don’t tell me what to wear, don’t tell me how to behave,” she sings, opening her top to reveal her bra. Seen by millions, the Kyrgyz-language feminist anthem has set off a new #MeToo debate in the Central Asian country, where many young women are still abducted, raped and forced to marry.
In the wake of the video, a first “feminist bar” is about to open in Bishkek. Other feminist videos have been used to directly address the issue of bride-kidnapping, with animated cartoons being used as part of local campaigns to change mindsets in a conservative society.
Perhaps most excitingly, an all-female team of 18 to 20-year-olds is building the country’s first micro-satellite. “Girls taking us into space is the best message against sexism,” says Bektour Iskender, whose news site Kloop initiated the project. He says the girls’ project has a deep social mission, promoting national pride and the country’s return to advanced technological development.
These examples – and countless more – show that civic groups see no value in lobbying an increasingly disinterested West and sluggish international organizations. Instead they focus their energy where they can achieve real, concrete change within their own communities. Their campaigns are grassroots-led and use local languages and issues their communities understand. They target specific audiences, often using technology and creative formats, with a heavy dose of visual and artistic elements.
Addressing discrimination, environmental protection, corruption, health issues, women’s rights, they speak not about the failure of their states to abide by international accords, but about common dignity and life opportunities, addressing people on a direct human level.
Clearly, the values of the Universal Declaration of Human Rights are still valid, but their approach and the packaging have changed. “We all want to change the world,” says Sergey Karpov of the Russian online media and philanthropic platform Takie Dela. “Today communications are the best way”.
Yesterday (18 June 2015) Amnesty International announced something new (or rather, something that will be new) in human rights education: a series of Massive Open Online Courses (MOOCs). Who knows, the horrible acronym may one day be as normal as HRDs or AI itself. To bring this about, Amnesty International is partnering with edX, a global leader in online education founded by Harvard University and MIT. The first MOOCs will be available later this year. The free online courses will be designed by human rights and education experts from across Amnesty International.
On 1 December 2014 a group of seven NGOs (Amnesty International, Digitale Gesellschaft, International Federation for Human Rights, Human Rights Watch, Open Technology Institute (at New America), Privacy International, Reporters sans frontieres) sent an Open Letter to the “Wassenaar Arrangement” (for what this is, see the link at the end). The key issue is that the alarming proliferation of surveillance technologies available to repressive countries adversely affects political activists, human rights defenders, refugees, dissidents and journalists.
Here is the text of the letter:
“We, the undersigned organisations, call upon the 41 Governments that compose the Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies, to take action and address the alarming proliferation of surveillance technologies available to repressive countries involved in committing systematic human rights violations. This trade results in unlawful surveillance, which often leads to further human rights violations including invasions of privacy, arbitrary arrest and detention, torture and other cruel, inhuman or degrading treatment or punishment, the silencing of free expression, preventing political participation, and crushing offline and online dissent.
Surveillance technologies are not simply harmless tools. In the wrong hands they are often used as a tool of repression. Evidence is continuing to reveal the extent of this secretive trade that puts countless individuals at direct risk from human rights abusing governments. More and more stories emerge showing these damaging and often unlawful technologies affecting political activists, human rights defenders, refugees, dissidents and journalists, with some technologies placing entire populations under surveillance. Governments with internationally condemned human rights records such as Bahrain, Ethiopia, Egypt, Turkmenistan, Libya, Syria and Iran have all purchased surveillance technologies from private companies, and have used them to facilitate a variety of human rights violations. Some revelations in France, Germany, the UK, and the US have led to police and judicial investigations following calls from NGOs and members of the Coalition Against Unlawful Surveillance Exports. Remarkably and despite mounting evidence of associated abuses, surveillance technology companies still openly market their products at ‘trade fairs’ across the UK, France, US, Brazil and the UAE among other countries.
Although steps were taken in 2013 to address this largely unregulated global market, governments cannot let the momentum halt. Governments have now included additional technologies associated with intrusion software and IP monitoring to the Lists of Dual Use Goods and Technologies and Munitions, and are aware of the impact surveillance technologies can have on human rights. There is now a pressing need to modernise out of date export controls. In addition, technologies such as undersea fibre-optic cable taps, monitoring centres, and mass voice / speaker recognition technologies urgently need to be examined for their impact on human rights and internal repression, particularly when the end user is a government known for committing human rights violations. Technologies evolve at a rapid pace and governments that abuse human rights take advantage of weak regulation, the product of poor understanding of the technologies and their capabilities.
In the current system, human rights and digital rights groups, as well as external independent experts, are excluded from contributing their expertise and knowledge to the Wassenaar Arrangement forum. The additional expertise and knowledge that civil society can bring to the debate is invaluable to this end. Discussions should not continue in a closed-forum manner and we urge governments to engage with civil society organisations to help ensure that accurate and effective controls are developed which reflect modern technological developments and do not impede legitimate scientific and security research.
Any export policy relating to surveillance technologies should place human rights at its heart. Governments must exercise a strict policy of restraint and should refuse to grant export licenses for surveillance technology destined for end-users in countries where they are likely to be used in an unlawful manner i.e. not compliant with human rights legal standards. Governments should consider the weakness or absence of an appropriate legal framework in the recipient country to ensure the transfer would not pose a substantial risk of the items being used to violate or abuse human rights. Governments should also be transparent in what they export, and to whom and support the development of an international legal framework to address the sale and trade of surveillance technologies.”
The Wassenaar Arrangement (41 participating States) has been established in order to contribute to regional and international security and stability, by promoting transparency and greater responsibility in transfers of conventional arms and dual-use goods and technologies, thus preventing destabilising accumulations. Participating States seek, through their national policies, to ensure that transfers of these items do not contribute to the development or enhancement of military capabilities which undermine these goals, and are not diverted to support such capabilities.