On 22 January 2024, Amnesty International published an interesting piece by Alex, a 31-year-old Romanian activist working at the intersection of human rights, technology and public policy.
Seeking to use her experience and knowledge of tech for political change, Alex applied for and was accepted onto the Digital Forensics Fellowship led by the Security Lab at Amnesty Tech. The Digital Forensics Fellowship (DFF) is an opportunity for human rights defenders (HRDs) working at the nexus of human rights and technology to expand their learning.
Here, Alex shares her activism journey and insight into how like-minded human rights defenders can join the fight against spyware:
In the summer of 2022, I watched a recording of Claudio Guarnieri, former Head of the Amnesty Tech Security Lab, presenting about Security Without Borders at the 2016 Chaos Communication Congress. After following the investigations of the Pegasus Project and other projects centring on spyware being used on journalists and human rights defenders, his call to action at the end — “Find a cause and assist others” — resonated with me long after I watched the talk.
Becoming a tech activist
A few days later, Amnesty Tech announced the launch of the Digital Forensics Fellowship (DFF). It was serendipity, and I didn’t question it. At that point, I had already pushed myself to seek out a more political, more involved way to share my knowledge. Not tech for the sake of tech, but tech activism to ensure political change.
I followed an atypical path for a technologist. Prior to university, I dreamt of being a published fiction author, only to switch to studying industrial automation in college. I spent five years as a developer in the IT industry and two as Chief Technology Officer for an NGO, where I finally found myself using my tech knowledge to support journalists and activists.
My approach to technology, like my approach to art, is informed by political struggles, as well as the questioning of how one can lead a good life. My advocacy for digital rights follows this thread. For me, technology is merely one of many tools at the disposal of humanity, and it should never be a barrier to decent living, nor an oppressive tool for anyone.
The opportunity offered by the DFF matched my interests and the direction I wanted to take my activism. During the year-long training programme from 2022-2023, the things I learned turned out to be valuable for my advocacy work.
In 2022, the Child Sexual Abuse Regulation was proposed in the EU. I focused on conducting advocacy to make it as clear as possible that losing encrypted communication would make life decidedly worse for everyone in the EU. We ran a campaign to raise awareness of the importance of end-to-end encryption for journalists, activists and people in general. Our communication unfolded under the banner of “you don’t realize how precious encryption is until you’ve lost it”. Apti.ro, the Romanian non-profit organisation that I work with, also participated in the EU-wide campaign, as part of the EDRi coalition. To add fuel to the fire, spyware scandals erupted across the EU. My home country, Romania, borders countries where spyware has been proven to have been used to invade the personal lives of journalists, political opponents of the government and human rights defenders.
The meaning of being a Fellow
The Security Lab provided us with theoretical and practical sessions on digital forensics, while the cohort was a safe, vibrant space to discuss challenges we were facing. We debugged together and discussed awful surveillance technology at length, contributing our own local perspective.
The importance of building cross-border networks of cooperation and solidarity became clear to me during the DFF. I heard stories of struggles from people involved in large and small organizations alike. I am convinced our struggles are intertwined, and we should join forces whenever possible.
Now when I’m working with other activists, I try not to talk of “forensics”. Instead, I talk about keeping ourselves safe, and our conversations private. Often, discussions we have as activists are about caring for a particular part of our lives – our safety when protesting, our confidentiality when organizing, our privacy when convening online. Our devices and data are part of this process, as is our physical body. At the end of the day, digital forensics are just another form of caring for ourselves.
I try to shape discussions about people’s devices similarly to how doctors discuss the symptoms of an illness. The person whose device is at the centre of the discussion is the best judge of the symptoms, and it’s important to never minimize their apprehension. It’s also important to go through the steps of the forensics in a way that allows them to understand what is happening and what the purpose of the procedure is.
I never use a one-size-fits-all approach because the situation of the person who owns a device informs the ways it might be targeted or infected.
The human approach to technology
My work is human-centred and technology-focused and requires care and concentration to achieve meaningful results. For activists interested in working on digital forensics, start by digging deep into the threats you see in your local context. If numerous phishing campaigns are unfolding, dig into network forensics and map out the owners of the domains and the infrastructure.
Secondly, get to know the person you are working with. If they are interested in secure communications, help them gain a better understanding of mobile network-based attacks, as well as suggesting instant messaging apps that preserve the privacy and the security of their users. In time, they will be able to spot “empty words” used to market messaging apps that are not end-to-end encrypted.
Finally, to stay true to the part of me that loves a well-told story, read not only reports of ongoing spyware campaigns, but narrative explorations from people involved. “Pegasus: The Story of the World’s Most Dangerous Spyware” by Laurent Richard and Sandrine Rigaud is a good example that documents both the human and the technical aspects. The Shoot the Messenger podcast, by PRX and Exile Content Studio, is also great as it focuses on Pegasus, starting from the brutal murder of Jamal Khashoggi to the recent infection of the device of journalist and founder of Meduza, Galina Timchenko.
We must continue to do this research, however difficult it may be, and to tell the stories of those impacted by these invasive espionage tactics. Without this work we wouldn’t be making the political progress we’ve seen to stem the development and use of this atrocious technology.
Danna joins HURIDOCS from the Amnesty Tech management team, where she played an integral role in growing globally distributed teams, securing and managing large grants, and providing strategic and operational leadership. She combines perceptive and empathetic leadership with a bright, organised, fearless approach to building organisational strength and resilience. See: https://humanrightsdefenders.blog/tag/danna-ingleton/
“At a time when the power of accurate, accessible and secure information has never been more important to those seeking justice and the fulfilment of their human rights, I am thrilled to be starting as the new Executive Director of HURIDOCS.”
It is exciting to be joining an organisation with such a rich history of harnessing the power of information to facilitate change. Together with my new colleagues and our diverse, valiant partners we will build on this history to ensure HURIDOCS is consistently at the sharp-edge of information management and technological developments, and always strategically growing.
As an activist myself who has been working in human rights for more than a decade I have seen how the battle for justice can take its toll on the people behind the movements. I am therefore also committed to ensuring HURIDOCS is an effective and accountable workplace that values health and the well-being of us all. – Danna Ingleton
Danna will officially assume her responsibilities on 1 July 2023.
Olga Solovyeva of Advox, a Global Voices project dedicated to protecting freedom of expression online, posted a piece on 19 April 2023 stating that the impact of technology on politics can no longer be ignored. It is a long piece that I reproduce in full, as it is worth reading and of great relevance for human rights defenders:
Amidst the rising influence of technology in global politics, particularly in authoritarian regimes, the imperative to acknowledge the political accountability of tech corporations has become increasingly apparent. In recent years, the ramifications of disregarding ethical practices underscore the urgent need for tech companies to prioritize responsible conduct. The manipulation of information online, traffic rerouting, restricting access to the internet, and operating surveillance are some examples of how states can misuse technology. While technology was once expected to become a symbol of resistance and liberation, illiberal regimes now use it to produce various forms of digital unfreedom that extend into material reality. But how do we ensure that Big Tech contributes to democratic practices rather than political oppression?
Why do tech companies have political responsibility?
In an innovation-driven sector like technology, legislation cannot keep pace with new developments. Often, neither users nor makers consider the negative consequences of a new technology until they have experienced them, and the industry is left struggling with the ramifications of harm and, as a consequence, its own expanding responsibilities.
Digital activists from Global Voices Advox report on the growing use of digital technology for advancing authoritarian regimes worldwide, focusing, among others, on issues such as surveillance, mis/disinformation and access to the internet in different contexts. Autocrats use the whole scale of digital technologies available. In Russia, where the interest of the state lies in keeping opposition views from the information environment, there is a strong emphasis on disinformation and censorship. Tanzania and Sudan are known for internet shutdowns, while in Turkey and Morocco, cases of public digital surveillance have become more common.
At the same time, the tech sector does not necessarily play only on the dark side. Elon Musk’s SpaceX has supported Starlink and provided internet access in Ukraine since the Russian invasion disrupted services. And yet his purchase of Twitter brought multiple controversies, further empowering the attention economy of social media, which leads to fragmentation, polarisation and the decline of the public sphere. It is impossible to separate tech companies from politics, and their role tends to cause controversy.
Good apple, bad apple
If you’re reading this text on a MacBook or iPhone, you have probably noticed the difference of living in an information space with much less targeted advertising. In February 2022, Apple introduced new privacy features allowing users to permit or block personal data tracking by the apps installed on the company’s devices, an innovation with significant political, social and economic consequences.
It’s crucial to understand the business decision that underpins the ongoing debate on personal data ethics and regulation. Protecting Apple users’ personal data means they will not be targeted with personally crafted advertising, and their data will not be used to predict consumer behaviour. This upholds users’ right to privacy — one of the central categories of online service providers’ moral responsibilities and, essentially, a human right — and the guarantee attracts consumers to Apple products.
At the same time, this architectural decision caused significant distress in the market, as the stock prices of Meta and other social media companies plunged. Introducing an opt-out for personal data collection shrinks their potential advertising revenues, as less data becomes available to develop personalised ads.
Apple made a policy-level decision, a milestone in the debate over user privacy regulation. Effectively, it is a subject of government concern at the intersection of information and business ethics, law and policy. The case illustrates the power of a single company, which can be not just a game changer in the conversation on tech regulation but a shock to the industry, pushing other businesses to shift their business models and challenging the dynamics of Big Tech.
What is this decision for Apple? An enactment of an ethical stand signalling its political responsibility? An act of an excellent corporate citizen innovating to enable its customers’ right to privacy? Or a marketing move to boost sales of Apple products by engaging in a non-market activity? Regardless of the motivation, we have witnessed a tech company effecting political change on an international level, since Apple products are in demand and sold worldwide.
At the same time, the company engages in other activities that may be seen as controversial. Along with other Big Tech companies, Apple increased its lobbying spending in 2022 as businesses faced growing pressure from lawmakers raising antitrust concerns to curb the power of tech giants. Meanwhile, stepping outside the liberal democratic political climate, Apple faces decisions that challenge its political stand. In 2021 the company confirmed storing all personal data of Chinese users inside China-based data centres. China is known for using surveillance as a tool of political persecution. Even though Apple claimed to maintain a high level of security, journalist sources report that the company handed over the keys to the government. The same year, Apple removed a smart voting app, one of the tools developed by the opposition in Russia to counter electoral fraud. In both cases, the company’s decision-making had severe and direct political consequences, just like the decision to block personal data tracking on its devices. The only difference was the kind of pressure put on the company by the political system it was operating in.
Where does the political responsibility of Big Tech end?
In 2022 the world saw the global expansion of authoritarian rule, affecting developing states and established democracies alike. According to the 2022 Freedom House report, only 20 percent of the world’s population live in a free country, while the remaining 80 percent are split roughly equally between partly free and not free countries. The world is becoming more authoritarian, and liberal democracy today is the exception rather than the rule.
Different autocracies pose challenging obstacles to tech companies, which remain the key producers of innovative technology. The role of the state defines what business can expect and the patterns of their relationship. In autocracies, political participation and public deliberation are repressed by state authorities, and business is shaped by a political economy marked by state intervention. The state prevails: it has direct control over companies when needed, and interference in economic life is both routine and unpredictable. Autocrats are notorious for censorship, propaganda and interference in electoral systems, all delivered by technology provided by business.
One of the most common examples is a business organization that must obey the law of an authoritarian state to maintain political legitimacy, while the law itself may undermine the company’s moral legitimacy. The case of Apple in China is an example of this. The consequences can differ for companies in other countries: Yahoo! (bought out by Verizon in 2017) was sued for handing the Chinese government data that led to the political persecution and torture of dissidents. In authoritarian regimes, legislation is often designed to set out specific requirements and processes for government agencies to obtain access to personal data, including for surveillance purposes. Even though data handovers upon request, e.g. by subpoena, are common in democratic regimes as well, the difference lies in how such data is subsequently used and whether there are institutional procedures to balance it out.
Elaborating on the political responsibility of Big Tech
As the intersection of technology and politics continues to expand, grappling with the political implications of new creations becomes imperative for tech innovators. They must take proactive steps to develop robust political responsibility strategies while navigating authoritarian and other ethically fraught environments. Transparency is one way to meet these goals.
The practice of environmental, social and governance (ESG) reporting and disclosure on ESG issues is an excellent example of how mandated transparency has led to accountability, and one that can be adapted to technological innovation. Openly revealing who has bought a given technology would, for example, limit the ability of authoritarian governments to abuse it. Additionally, integrating political responsibility into responsible investment portfolios could be a meaningful step towards an open dialogue about tech, politics and society. This could be done by disclosing companies’ direct political engagement and adding transparency about the contexts in which they operate.
Yet such openness would be even more problematic — and potentially impossible — for tech companies that have developed within the borders, and hence the jurisdiction, of authoritarian regimes. One of the most illustrative examples is Yandex, a multinational company headquartered in Russia. The company grew into a major tech player, often referred to as the “Russian Google.” Despite making the occasional compromise with the political system, it kept its reputation as the most liberal company in the country while showing steady business growth.
However, when Russia invaded Ukraine in February 2022, Yandex faced significant pressure, legislative restrictions, international sanctions and public criticism. From the first weeks of the war, Yandex News, visited daily by 40 million people, indexed only stories from state-owned media, amplifying the narratives of the “special operation.” Abiding by the law became equivalent to contributing to univocal media coverage dominated by the Russian state.
The war became the most significant trigger affecting the company, as the share price of this prominent business lost over 75 percent of its value. Many employees, including top management, resigned or left the country in protest against the war. Personal sanctions were applied to the company’s CEO and founder. Under pressure, the company sold its media assets to a holding loyal to the state. In December, the company’s founder left Yandex Russia but remained the key shareholder.
Scenarios like these establish controversial ground for businesses that must come to terms with an authoritarian state’s rules to keep operating. Albert Hirschman’s “Exit, Voice, and Loyalty: Responses to Decline in Firms, Organizations, and States” suggests a framework of three strategies for responding to a perceived decline in the performance of an organization or a state. Used as a guide to organizational strategy, a tech company facing authoritarianism could leave, protest or comply. However, since authoritarianism is usually characterized by the suppression of public dissent, realistically only two strategies remain: to stay or to go.
Nevertheless, both strategies raise further ethical concerns. Much has been said about the downsides of collaborating with autocrats, but how ethical is it towards employees and customers for a business to leave a declining state? Moreover, a business remains first of all a profit-generating enterprise, and very few countries offer a market in which a company’s leadership can afford to hold to the standard of political responsibility. We can’t all live in Norway, after all.
As the influence of tech companies continues to grow, it falls to civil society, journalists, tech users, and watchdog organisations to keep these firms accountable. Demanding transparency and collaborating to come up with new fair policies that could support tech companies in tough contexts could be one way forward. Meanwhile, it is important to educate the public and create incentives for consuming tech other than instant gratification. By working together, these stakeholders can start shaping a more ethical tech landscape, where common good carries more weight than corporate interest.
Towards Life 3.0: Ethics and Technology in the 21st Century is a talk series organized and facilitated by Dr. Mathias Risse, Director of the Carr Center for Human Rights Policy, and Berthold Beitz Professor in Human Rights, Global Affairs, and Philosophy. Drawing inspiration from the title of Max Tegmark’s book, Life 3.0: Being Human in the Age of Artificial Intelligence, the series draws upon a range of scholars, technology leaders, and public interest technologists to address the ethical aspects of the long-term impact of artificial intelligence on society and human life.
On 20 April you can join a 45-minute session with WITNESS’ new Executive Director Sam Gregory [see: https://humanrightsdefenders.blog/2023/04/05/sam-gregory-finally-in-the-lead-at-witness/] on how AI is changing the media and information landscape; the creative opportunities for activists and the threats to truth created by synthetic image, video and audio; and the people and places being impacted but left out of the current conversation.
Sam says “Don’t let the hype-cycle around ChatGPT and Midjourney pull you into panic, WITNESS has been preparing for this moment for the past decade with foundational research and global advocacy on synthetic and manipulated media. Through structured work with human rights defenders, journalists, and technologists on four continents, we’ve identified the most pressing concerns posed by these emerging technologies and concrete recommendations on what we must do now.
We have been listening to critical voices around the globe to anticipate and design thoughtful responses to the impact of deepfakes and generative AI on our ability to discern the truth. WITNESS has proactively worked on responsible practices for synthetic media as a part of the Partnership on AI and helped develop technical standards to understand media origins and edits with the C2PA. We have directly influenced standards for authenticity infrastructure and continue to forcefully advocate for centering equity and human rights concerns in the development of detection technologies. We are convening with the people in our communities who have most to gain and lose from these technologies to hear what they want and need, most recently in Kenya at the #GenAIAfrica convening”.
Jon Stone in the Independent of 13 July 2020 wrote about the UK government being urged to explain £75m in exports to countries rated “not free”. The British government is providing more than a dozen repressive regimes around the world with wiretaps, spyware and other telecommunications interception equipment they could use to spy on dissidents, public records show. Despite rules saying the UK should not export security goods to countries that might use them for internal repression, ministers have signed off more than £75m in such exports over the past five years to states rated “not free” by the NGO Freedom House.
The 17 countries include China, Saudi Arabia and Bahrain, as well as the United Arab Emirates, which was the biggest recipient, with licences totalling £11.5m alone since 2015… One such beneficiary of the UK’s exports is Hong Kong, which had a £2m shipment approved last year despite the ongoing repression of pro-democracy protests. The Philippines, where extrajudicial police killings are rampant, has also provided steady business for British firms hawking surveillance systems…
A government spokesperson said blandly: “The government takes its export responsibilities seriously and assesses all export licences in accordance with strict licensing criteria. We will not issue any export licences where to do so would be inconsistent with these criteria.” But Oliver Feeley-Sprague, Amnesty International UK’s programme director for military, security and police affairs, said the UK did not seem to be undertaking proper risk assessments when selling such equipment, adding that the government’s controls were becoming “notorious” for their “faulty decision-making”…
“With numerous human rights defenders arrested and jailed in countries like Saudi Arabia, the UAE and Turkey in the past five years, there’s a greater need than ever for the UK to be absolutely scrupulous in assessing the risk of UK telecoms technology being used unlawfully against human rights activists, journalists, and peaceful opposition figures.
“It’s just not clear that the UK is undertaking proper risk assessments when selling this equipment, and it’s not clear whether UK officials are making any effort to track how the equipment is used in one, two or three years’ time.
This week international trade secretary Liz Truss announced the UK would resume arms exports to Saudi Arabia, after a court had previously ordered their suspension. The government said it had reviewed claims that Saudi forces in Yemen had breached international humanitarian law and concluded that any possible breaches were “isolated incidents” because they had happened in different places and in different ways.
Andrew Smith of Campaign Against Arms Trade said the sale of the spying equipment raised “serious questions and concerns”.
As the Universal Declaration of Human Rights turns 70 – is it time for a new approach? asks Barbara von Ow-Freytag, journalist, political scientist and adviser at the Prague Civil Society Centre, in the World Economic Forum. This piece is certainly worth reading as a whole. It is close to my heart in that it stresses the need to take a hard look at how young human rights defenders focus their energy where they can achieve real, concrete change within their own communities. Their campaigns are grassroots-led and use local languages and issues their communities understand. They often use technology and creative formats, with a heavy dose of visual and artistic elements. Where the international scene seems to stagnate and even backpedal, better use of communication skills and tools (such as images) is certainly part of the answer:
As the Universal Declaration of Human Rights turns 70, a new generation of human rights defenders are reinventing themselves to fight for old rights amid a new world order. Based not on declarations, charters and international bodies, but on the values which underpin them – justice, fairness, equality – they shun the language of their predecessors while embracing the same struggle…However, in the new realities of the 21st century, the mechanisms to promote human rights that grew out of the Universal Declaration are showing their age. Authoritarianism is on the rise across the world, with popular leaders cracking down on human rights defenders.
Freedom House found 2018 was the 12th consecutive year that the world became less free. Civicus, which specifically monitors the conditions for civil society activists and human rights defenders, found civil society was “under attack” in more countries than it wasn’t, with all post-Soviet countries (except Georgia) ranging between “obstructed” and “closed”.
Troublingly, both the willingness and the ability of Western bastions of human rights are also on the wane. Inside the EU, talk of illiberal democracy gains traction, and internal crises divert attention away from the global stage. Perhaps unsurprisingly, throughout Eastern Europe and the former Soviet Union, younger activists and civil society are giving up on western governments and international organizations to advocate on their behalf. Pavel Chikov, director of the Agora group, said recently that, “Russian human rights groups no longer have a role model,” calling the liberal human rights agenda “obsolete”.
Growing disillusionment has led many rights groups to shift away from appealing to outsiders for support. Younger campaigners no longer frame their work in the traditional language of human rights, and many do not even consider themselves human rights defenders. Instead of referring to international agreements violated, they focus on solving practical problems, or creating their own opportunities to advance values of equality, justice and fairness.
Formats too have changed. Throughout the region, the tools used by civil society to raise social consciousness are becoming diverse, dynamic and smart. Instead of one-person legal tours de force, genuinely grassroots, tech-powered, peer-to-peer or horizontal networks are proving effective. Media, music, art, film, innovative street protests, urbanism and online initiatives focused on local communities are coming to replace petitions and international advocacy.
Team 29, an association of Russian human rights lawyers and journalists, is among the most successful of this new generation. It has repositioned itself as part-legal aid provider, part-media outlet. Its website offers a new mix of news on ongoing trials, animated online handbooks for protesters, videos on torture and a new interactive game telling young people how to behave if they are detained by police.
What may look like PR-friendly add-ons are actually core to their operation. Anastasia Andreeva, the team’s media expert, says: “Before, we consulted some 30 clients, now we reach tens of thousands of people.”
Azerbaijani activist Emin Milli also embodies this journey of wider civil society – turning away from the international towards local solutions. In the early 2000s, he was a traditional human rights defender, successfully using international mechanisms such as the Council of Europe to assist political prisoners. He went on to found Meydan TV, a Berlin-based independent media outlet covering Azerbaijan.
The key to Meydan’s success is its accessibility. Milli says: “We do stories about ordinary people. Real Azeris who have everyday problems.” Through its smart coverage, investigating and highlighting how injustice affects these ordinary people, and not referring to UN-enshrined rights and responsibilities, Meydan is “giving a voice to people who fight for women’s rights, people who fight for political rights, for civil liberties, and everybody who feels they are voiceless”.
Music, too, is increasingly being used as a vehicle to realize human rights. Though he might shun the label, Azeri rapper Jamal Ali is perhaps one of the country’s most well-known “human rights defenders”. His songs about injustice and corruption regularly go viral, raising national and international awareness in the same way a statement at the UN General Assembly might have done three decades ago.
In a 2017 hit, he highlighted how two young men had been tortured by police and faced 10 years in prison for spraying graffiti on a statue of former president Heydar Aliyev. In response, the regime arrested Ali’s mother, demanding that he remove the video from YouTube, only to ensure that Ali’s song went even more viral among Azeri youngsters.
Gender equality and women’s rights are also being advanced through unexpected new champions. In Kyrgyzstan, 20-year-old singer Zere Asylbek sparked a feminist shockwave earlier this year with her video Kyz (“Girl”). “Don’t tell me what to wear, don’t tell me how to behave,” she sings, baring her top to reveal her bra. Seen by millions, the Kyrgyz-language feminist anthem has set off a new #MeToo debate in the Central Asian country, where many young women are still abducted, raped and forced to marry.
In the wake of the video, a first “feminist bar” is about to open in Bishkek. Other feminist videos have been used to directly address the issue of bride-kidnapping, with animated cartoons being used as part of local campaigns to change mindsets in a conservative society.
Perhaps most excitingly, an all-female team of 18 to 20-year-olds is building the country’s first micro-satellite. “Girls taking us into space is the best message against sexism,” says Bektour Iskender, whose news site Kloop initiated the project. He says the girls’ project has a deep social mission, promoting national pride and the country’s return to advanced technological development.
These examples – and countless more – show that civic groups see no value in lobbying an increasingly uninterested West and sluggish international organizations. Instead they focus their energy where they can achieve real, concrete change within their own communities. Their campaigns are grassroots-led and use local languages and issues their communities understand. They target specific audiences, often using technology and creative formats, with a heavy dose of visual and artistic elements.
Addressing discrimination, environmental protection, corruption, health issues and women’s rights, they speak not about the failure of their states to abide by international accords, but about common dignity and life opportunities, addressing people on a direct human level.
Clearly, the values of the Universal Declaration of Human Rights are still valid, but their approach and the packaging have changed. “We all want to change the world,” says Sergey Karpov of the Russian online media and philanthropic platform Takie Dela. “Today communications are the best way”.
Yesterday (18 June 2015) Amnesty International announced something new (or rather, something that will be new) in human rights education: a series of Massive Open Online Courses (MOOCs). Who knows, the horrible acronym may one day be as familiar as HRDs or AI itself. To make this happen, Amnesty International is partnering with edX, a global leader in online education founded by Harvard University and MIT. The first MOOCs will be available later this year. The free online courses will be designed by human rights and education experts from across Amnesty International.
On 1 December 2014 a group of seven NGOs (Amnesty International, Digitale Gesellschaft, International Federation for Human Rights, Human Rights Watch, Open Technology Institute (at New America), Privacy International, Reporters sans frontieres) sent an Open Letter to the “Wassenaar Arrangement” (for an explanation of what this is, see the link at the end). The key issue is that the alarming proliferation of surveillance technologies available to repressive countries adversely affects political activists, human rights defenders, refugees, dissidents and journalists.
Here is the text of the letter:
“We, the undersigned organisations, call upon the 41 Governments that compose the Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies, to take action and address the alarming proliferation of surveillance technologies available to repressive countries involved in committing systematic human rights violations. This trade results in unlawful surveillance, which often leads to further human rights violations including invasions of privacy, arbitrary arrest and detention, torture and other cruel, inhuman or degrading treatment or punishment, the silencing of free expression, preventing political participation, and crushing offline and online dissent.
Surveillance technologies are not simply harmless tools. In the wrong hands they are often used as a tool of repression. Evidence is continuing to reveal the extent of this secretive trade that puts countless individuals at direct risk from human rights abusing governments. More and more stories emerge showing these damaging and often unlawful technologies affecting political activists, human rights defenders, refugees, dissidents and journalists, with some technologies placing entire populations under surveillance. Governments with internationally condemned human rights records such as Bahrain, Ethiopia, Egypt, Turkmenistan, Libya, Syria and Iran have all purchased surveillance technologies from private companies, and have used them to facilitate a variety of human rights violations. Some revelations in France, Germany, the UK, and the US have led to police and judicial investigations following calls from NGOs and members of the Coalition Against Unlawful Surveillance Exports. Remarkably and despite mounting evidence of associated abuses, surveillance technology companies still openly market their products at ‘trade fairs’ across the UK, France, US, Brazil and the UAE among other countries.
Although steps were taken in 2013 to address this largely unregulated global market, governments cannot let the momentum halt. Governments have now included additional technologies associated with intrusion software and IP monitoring to the Lists of Dual Use Goods and Technologies and Munitions, and are aware of the impact surveillance technologies can have on human rights. There is now a pressing need to modernise out of date export controls. In addition, technologies such as undersea fibre-optic cable taps, monitoring centres, and mass voice / speaker recognition technologies urgently need to be examined for their impact on human rights and internal repression, particularly when the end user is a government known for committing human rights violations. Technologies evolve at a rapid pace and governments that abuse human rights take advantage of weak regulation, the product of poor understanding of the technologies and their capabilities.
In the current system, human rights and digital rights groups, as well as external independent experts, are excluded from contributing their expertise and knowledge to the Wassenaar Arrangement forum. The additional expertise and knowledge that civil society can bring to the debate is invaluable to this end. Discussions should not continue in a closed-forum manner and we urge governments to engage with civil society organisations to help ensure that accurate and effective controls are developed which reflect modern technological developments and do not impede legitimate scientific and security research.
Any export policy relating to surveillance technologies should place human rights at its heart. Governments must exercise a strict policy of restraint and should refuse to grant export licenses for surveillance technology destined for end-users in countries where they are likely to be used in an unlawful manner i.e. not compliant with human rights legal standards. Governments should consider the weakness or absence of an appropriate legal framework in the recipient country to ensure the transfer would not pose a substantial risk of the items being used to violate or abuse human rights. Governments should also be transparent in what they export, and to whom and support the development of an international legal framework to address the sale and trade of surveillance technologies.”
The Wassenaar Arrangement (41 participating States) has been established in order to contribute to regional and international security and stability, by promoting transparency and greater responsibility in transfers of conventional arms and dual-use goods and technologies, thus preventing destabilising accumulations. Participating States seek, through their national policies, to ensure that transfers of these items do not contribute to the development or enhancement of military capabilities which undermine these goals, and are not diverted to support such capabilities.
On 20 November 2014 Amnesty International launched a new tool that human rights defenders can use in their struggle against surveillance. It is called: DETEKT. As I have often expressed concern about digital security in this blog (see: https://thoolen.wordpress.com/tag/digital-security/), here are major excerpts from the Questions and Answers provided in the press release:
What is Detekt and how does it work?
Detekt is a free tool that scans your computer for traces of known surveillance spyware used by governments to target and monitor human rights defenders and journalists around the world. By alerting users to the fact that they are being spied on, it gives them the opportunity to take precautions.
It was developed by security researchers and has been used to assist in Citizen Lab’s investigations into government use of spyware against human rights defenders, journalists and activists as well as by security trainers to educate on the nature of targeted surveillance. Amnesty International is partnering with Privacy International, Digitale Gesellschaft and the Electronic Frontier Foundation.
Why are you launching Detekt now?
The latest technologies enable governments to track, monitor and spy on people’s activities like never before. Through the use of these technologies, governments can read private correspondence and even turn on the camera and microphone of a computer without its owner knowing it. Our ultimate aim is for human rights defenders, journalists and civil society groups to be able to carry out their legitimate work without fear of surveillance, harassment, intimidation, arrest or torture.
Has anyone used Detekt successfully to know if they were being spied on?
Detekt was developed by researchers affiliated with the Citizen Lab, who used a preliminary version of the tool during the course of their investigations into the use of unlawful surveillance equipment against human rights defenders in various countries around the world.
For example, according to research carried out by Citizen Lab and information published by Wikileaks, FinSpy – spyware developed by FinFisher, a German firm that used to be part of UK-based Gamma International – was used to spy on prominent human rights lawyers and activists in Bahrain.
How effective is this tool against technologies developed by powerful companies?
Detekt is a very useful tool that can uncover the presence of some commonly used spyware on a computer, however it cannot detect all surveillance software. In addition, companies that develop the spyware will probably react fast to update their products to ensure they avoid detection. This is why we are encouraging security researchers in the open-source community to help the organizations behind this project to identify additional spyware or new versions to help Detekt keep up to date.
It is important to underline that if Detekt does not find trace of spyware on a computer, it does not necessarily mean that none is present. Rather than provide a conclusive guarantee to activists that their computer is infected, our hope is that Detekt will help raise awareness of the use of such spyware by governments and will make activists more vigilant to this threat.
In addition, by raising awareness with governments and the public, we will be increasing pressure for more stringent export controls to ensure that such spyware is not sold to governments who are known to use these technologies to commit human rights violations.
How widely do governments use surveillance technology?
Governments are increasingly using surveillance technology, and targeted surveillance in particular, to monitor the legitimate activities of human rights activists and journalists. Powerful software developed by companies allows governments and intelligence agencies to read personal emails, listen in on Skype conversations or even remotely turn on a computer’s camera and microphone without its owner knowing about it. In many cases, the information they gather through those means is used to detain, imprison and even torture activists into confessing to crimes.
How big is the unregulated trade in surveillance equipment? What are the main companies and countries involved?
The global surveillance industry is estimated to be worth approximately US$5 billion a year – with profits growing 20 per cent every year. European and American companies have been quietly selling surveillance equipment and software to countries across the world that persistently commit serious human rights violations. Industry self-regulation has failed, and government oversight has now become an urgent necessity.
Privacy International has extensively documented the development, sale and export of surveillance technologies by private companies to regimes around the world. Recipient countries include: Bahrain, Bangladesh, Egypt, Ethiopia, Libya, Morocco, South Africa, Syria and Turkmenistan.
Isn’t publicizing the existence of this tool giving governments a heads up about how they can avoid being caught (by adapting new equipment which avoids detection)?
The technologies that allow governments to efficiently and covertly monitor the digital communications of their citizens are continuously improving. This is happening across the world. The growing trend in indiscriminate mass surveillance on a global scale was laid bare by the Edward Snowden disclosures. In addition to mass surveillance technologies, many governments are using sophisticated tools to target specific human rights defenders and journalists who work to uncover abuses and injustice. The new spyware being developed and used is powerful and dangerous and putting many human rights activists and journalists at risk of abuse.
As surveillance technologies develop in sophistication, it is vital that civil society groups learn how to protect their digital communications. No one tool or intervention will be enough to do this. We hope Detekt will become a new approach for investigating surveillance while sensitizing people to the threats.
However, long term we must also demand that governments live up to their existing commitments to human rights and that they and companies put in place stronger protections to ensure that new technologies are not used to violate human rights.
Surveillance is also used to carry out legitimate criminal investigations, why are you against it?
Targeted surveillance is only justifiable when it occurs based on reasonable suspicion, in accordance with the law, is strictly necessary to meet a legitimate aim (such as protecting national security or combatting serious crime) and is conducted in a manner that is proportionate to that aim and non-discriminatory.
Indiscriminate mass surveillance – the widespread and bulk interception of communication data that is not targeted or based on reasonable suspicion – is never justifiable. It interferes with a range of human rights, particularly the rights to privacy and freedom of expression.
The Detekt tool can be downloaded from its GitHub page.
Today, 23 June 2014, Amnesty International launches its open source ‘Panic Button’ app to help human rights defenders facing imminent danger. The aim is to increase protection for those who face the threat of arrest, attack, kidnap and torture. In short: