Posts Tagged ‘privacy’

Norway’s NGOs furious about Telenor’s data ending up in the hands of Myanmar’s junta

March 31, 2022
Former Minister of Trade and Industry Monica Mæland visiting Myanmar in 2014. Photo: Trond Viken, Ministry of Trade and Industry

On 25 March, Telenor announced that the telecom giant had transferred the operational activities of Telenor Myanmar to M1 Group. [see: https://humanrightsdefenders.blog/2021/10/26/norways-telenor-in-myanmar-should-do-more-than-pull-out-it-should-not-hand-sensitive-data-to-the-regime/] In a release following the announcement, the Norwegian Forum for Development and Environment (ForUM) condemned the sale, and Kathrine Sund-Henriksen, ForUM’s general manager, called it a dark day for Telenor and for Norway as a human rights nation.

ForUM is a network of 50 Norwegian organizations working on development, environment, peace, and human rights, with a vision of a democratic and peaceful world based on fair distribution, solidarity, human rights, and sustainability. ForUM writes that, along with the operational activities of Telenor Myanmar, Telenor has also handed M1 Group the sensitive personal data of 18 million former Telenor customers, and that there is an imminent danger this information will soon be in the hands of the country’s brutal military dictatorship. ForUM is furious at the news that the sale has been completed.

“Ever since the sale was announced last summer, we have worked to prevent it, because there is a big risk that the military junta will gain access to sensitive personal information and use it to persecute, torture, and kill regime critics. Incredibly, Telenor is going through with a sale that has been criticized by human rights experts, civil society, Myanmar’s government in exile, and even their own employees in the country,” says Kathrine Sund-Henriksen.

Telenor has admitted that it has known since October last year that the junta is using M1 Group as an intermediary and that the data will soon end up in the hands of the Shwe Byain Phyu Group, a local conglomerate with close ties to the junta. Kathrine Sund-Henriksen believes it is only a matter of time before the sale has tragic consequences for human rights activists in the country.

“When metadata is transferred, the junta will be able to know whom a user has called, how long the call lasted, and where the call was made. All of this can be used by the junta to expose activist groups operating in secret. According to the UN, the junta has killed more than 1,600 people and arrested more than 12,000 since last year’s coup. Those numbers will continue to increase, and Telenor has given the junta all the information it needs to expose human rights defenders,” Kathrine Sund-Henriksen says.

https://www.forumfor.no/nyheter/2022/forum-for-utvikling-og-miljo-fordommer-salget-av-telenor-myanmar

Frontline’s Guide to Secure Group Chat and Conferencing Tools

July 21, 2020

With teams increasingly working remotely during COVID-19, we are all facing questions regarding the security of our communication with one another: Which communication platform or tool is best to use? Which is the most secure for holding sensitive internal meetings? Which will have adequate features for online training sessions or remote courses without compromising the privacy and security of participants?

Front Line Defenders presents this simple overview which may help you choose the right tool for your specific needs.

FLD Secure Group Chat Flowchart

Download PDF of the flow chart

Note:

  • With end-to-end encryption (e2ee), your message gets encrypted before it leaves your device and only gets decrypted when it reaches the intended recipient’s device. Using e2ee is important if you plan to transmit sensitive communication, such as during internal team or partners meetings.
  • With encryption to-server, your message gets encrypted before it leaves your device, but it is decrypted on the server, processed, and encrypted again before being sent to the recipient(s). Having encryption to-server is OK if you fully trust the server. (A minimal sketch of the difference between the two models follows this note.)
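
To make the distinction concrete, here is a minimal sketch in Python using the PyNaCl library; the key names and the message are invented for the example. With end-to-end encryption, the server only ever relays ciphertext, whereas with to-server encryption the server itself holds a key that can read the message.

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Illustrative only: real messengers also handle key verification,
# forward secrecy and metadata protection, which this sketch does not.
from nacl.public import PrivateKey, Box

# Each party generates a keypair on their own device.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's public key.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"meeting moved to 14:00")

# A relay server only ever sees `ciphertext` and cannot read the message.
# Only the recipient, holding the matching private key, can decrypt it.
receiving_box = Box(recipient_key, sender_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meeting moved to 14:00"
```

With to-server encryption, the decryption step above would happen on the provider’s server instead of on the recipient’s device, which is why that model requires fully trusting the server operator.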

Why Zoom or other platforms/tools are not listed here: There are many platforms which can be used for group communication. In this guide we focused on those we think deliver a good user experience and offer the best privacy and security features. Of course, no platform can offer 100% privacy or security; all communication carries some margin of risk. We have not included tools such as Zoom, Skype, Telegram etc. in this guide, as we believe that the margin of risk incurred whilst using them is too wide, and therefore Front Line Defenders does not feel comfortable recommending them.

Surveillance and behaviour: Some companies like Facebook, Google, Apple and others regularly collect, analyse and monetize information about users and their online activities. Most, if not all, of us are already profiled by these companies to some extent. If the communication is encrypted to-server, the owners of the platform may store this communication. Even with end-to-end encryption, communication metadata such as location, time, whom you connect with and how often may still be stored. If you are uncomfortable with this data being collected, stored and shared, we recommend refraining from using those companies’ services.
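
As a concrete illustration, a hypothetical metadata record kept by a provider for a single end-to-end encrypted call might look like the sketch below. None of the fields contain message content, yet together they reveal who talked to whom, when, for how long and roughly from where; all field names and values here are invented for the example.

```python
# Hypothetical example of call metadata a provider or network operator could
# retain even when the call content itself is end-to-end encrypted.
call_metadata = {
    "caller": "+00 555 0101",             # invented numbers
    "callee": "+00 555 0102",
    "started_at": "2020-07-21T09:14:00Z",
    "duration_seconds": 1880,
    "caller_ip": "198.51.100.23",          # documentation-range IP address
    "approx_location": "cell tower 4711",
    "client_version": "chat-app 3.2.1",
}
# Note: the encrypted message or voice data does not appear in this record.
```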

The level of protection of your call depends not only on which platform you choose, but also on the physical security of the space you and others on the call are in and the digital protection of the devices you and others use for the call.


Caution: Use of encryption is illegal in some countries. You should understand and consider the law in your country before deciding to use any of the tools mentioned in this guide.

Criteria for selecting the tools or platforms

Before selecting any communication platform, app or program it is always strongly recommended that you research it first. Below we list some important questions to consider:

  • Is the platform mature enough? How long has it been running for? Is it still being actively developed? Does it have a large community of active developers? How many active users does it have?
  • Does the platform provide encryption? Is it end-to-end encrypted or just to-server encrypted?
  • In which jurisdiction is the owner of the platform based, and where are its servers located? Does this pose a potential challenge for you or your partners?
  • Does the platform allow for self-hosting?
  • Is the platform open source? Does it provide source code to anyone to inspect?
  • Was the platform independently audited? When was the last audit? What do experts say about the platform?
  • What is the history of the development and ownership of the platform? Have there been any security challenges? How have the owners and developers reacted to those challenges?
  • How do you connect with others? Do you need to provide phone number, email or nickname? Do you need to install a dedicated app/program? What will this app/program have access to on your device? Is it your address book, location, mic, camera, etc.?
  • What is stored on the server? What does the platform’s owner have access to?
  • Does the platform have features needed for the specific task/s you require?
  • Is the platform affordable? Consider potential subscription fees, the cost of learning and implementation, possible IT support, hosting costs, etc.

The document then gives more detailed information on each tool/service listed in this guide:

Signal – https://signal.org/

Delta Chat – https://delta.chat/

Wire – https://wire.com/

Jitsi Meet – https://jitsi.org/jitsi-meet/

BigBlueButton – https://bigbluebutton.org/

Whereby – https://whereby.com

Blue Jeans – https://www.bluejeans.com/

GoToMeeting – https://www.gotomeeting.com/

FaceTime / iMessage – https://www.apple.com/ios/facetime

Google Meet – https://meet.google.com/

Duo – https://duo.google.com/

WhatsApp – https://www.whatsapp.com/

Video calls, webinar or online training recommendations

Video calls recommendations: In the current situation you will undoubtedly find yourself organizing or participating in many more video calls than before. It may not be obvious to everyone how to do it securely and without exposing yourself and your data to too much risk:

  • Assume that when you connect to talk your camera and microphone may be turned on by default. Consider covering your camera with a sticker (making sure it doesn’t leave any sticky residue on the camera lens) and only remove it when you use the camera.
  • You may not want to give away too much information about your house, family pictures, notes on walls or boards, etc. Be mindful of the background: who and what is in the frame aside from yourself? Test before the call by, for example, opening meet.jit.si and clicking the GO button to get a random empty room with your camera switched on, so you can see what is in the picture. Consider clearing your background of clutter.
  • Also be mindful of who can be heard in the background. Consider closing the door and windows, or alerting those sharing your space about your meeting.
  • Video call services may collect information on your location and activity; consider using a VPN (see the Physical, emotional and digital protection while using home as office in times of COVID-19 guide).
  • It is best to position your face so your eyes are roughly in the upper third of the picture, without cutting off your head. Unless you prefer not to reveal your face, do not sit with your back to a light or a window; daylight or a lamp in front of you works best. Stay within the camera frame. You may want to look into the lens from time to time to make “eye contact” with others. If you are using your cellphone, rest it against a steady object (e.g. a pile of books) so that the video picture remains stable.
  • You may want to mute your microphone to prevent others hearing you typing notes or any background noise as it can be very distracting to others on the call.
  • If the internet connection is slow, you may want to switch off your camera, pause other programs, mute the microphone and ask others to do the same. You may also want to try sitting closer to the router, or connecting your computer directly to the router with an ethernet cable. If you share your internet connection with others, you may ask them to reduce heavy use of the internet for the duration of your call.
  • It is very tempting to multitask, especially during group calls. But you may very soon realise that you are lost in the meeting, and others may notice this too.
  • If this is a new situation for you or you are using a new calling tool, you may want to give yourself a few extra minutes to learn and test it prior to the scheduled meeting to get familiar with options like turning on/off the camera and the microphone, etc.
  • If possible, prepare and test a backup communication plan in case you have trouble connecting with others, for example adding them to a Signal group so you can still text chat or troubleshoot problems with the call. Sometimes it also helps to have an alternative browser installed on your computer, or an alternative app on your phone, to try connecting with.

If you would like to organise a webinar or online training, you can use the tools outlined above for group communication. Some best practices include:

  • Make sure that you know who is connected. If needed, check the identities of all people participating by asking them to speak. Do not assume you know who is connected just by reading the assigned names.
  • Agree on ground-rules, like keeping cameras on/off, keeping microphone on/off when one is not speaking, flagging when participants would like to speak, who will be chairing the meeting, who will take notes – where and how will those notes be written and then distributed, is it ok to take screenshots of a video call, is it ok to record the call, etc.
  • Agree on clear agendas and time schedules. If your webinar is longer than one hour, it is probably best to divide it into clear one-hour sessions separated by breaks agreed with participants. Plan for the possibility that not all participants will return after a break, and have alternative ways to reach out and remind them to return, such as their Signal/Wire/DeltaChat contacts.
  • It is easiest to use a meeting service that participants connect to using a browser without a need to register or install a special program, one that also gives the webinar organiser the ability to mute microphones and close cameras of participants.
  • Prior to the call, check with all participants whether they have particular needs, such as if they are deaf or hard of hearing, if they are visually impaired or blind, or any other conditions which would affect their participation in the call. With this in mind, ensure that the selected platform will accommodate these needs and to be sure, test the platform beforehand. Simple measures can also improve inclusion and participation in your calls, such as turning on cameras when possible, as it can allow for lip-reading.
  • Encourage all participants to speak slowly and to avoid jargon where possible, as the working language of the call is most likely not everyone’s mother tongue. Naturally, there will be moments of silence and pauses; embrace them. They can help to support understanding, are helpful for participants who are hard of hearing and for interpreters, and will also help assistive technology pick up words correctly.

https://www.frontlinedefenders.org/en/resource-publication/guide-secure-group-chat-and-conferencing-tools

Policy response from Human Rights NGOs to COVID-19: RFK

April 3, 2020

In the midst of the COVID-19 crisis, many human rights organisations have been formulating a policy response. While I cannot be exhaustive or undertake comparisons, I will try to give some examples in the course of the coming weeks. Here is the one by Kerry Kennedy of Robert F. Kennedy Human Rights:

…Nearly 52 years later, it is just as imperative that we take to heart his message to “remember those who live with us,” that our societal response to the coronavirus pandemic be tethered to the same strong sense of equity and social justice of which my father spoke.

In the midst of this global pandemic, that means:

Remembering the most vulnerable—those without a stable or permanent home, those with disabilities, and those without a safety net who have no ability to work from the shelter of their homes or take time off, by ensuring that everyone has access to adequate, affordable healthcare. Those of us who can afford to stock our pantries with reserves must not hoard, instead ensuring that local food depositories and soup kitchens are sufficiently funded and supplied.

Remembering the prisoners—who are unable to practice social distancing to prevent the spread of illness. At Robert F. Kennedy Human Rights, we echo the increasing calls to release people being detained pretrial and in immigration detention, starting with the most vulnerable, to ease spread of the virus in crowded, unjust lockups. Jail and immigration detention should never equate to death sentences, and we hope that the current public health crisis will help us see with new eyes how these systems of mass human caging are and have always been so incredibly cruel, dangerous, violent, and unnecessary.

Remembering the truth tellers—as national governments increasingly declare states of emergency to bolster their responses to the pandemic and save lives, we must keep a watchful eye, given the rise of authoritarianism, to ensure that civic space is protected. Governments around the world have made a practice of using such states of emergency to curtail the legitimate activity of human rights defenders. These actions, such as China’s mandate that citizens carry cell phones so they can be constantly tracked, followed by Israel’s announcement that its citizens must do the same, must comply with international law mandating timeliness and sunset clauses, proportionality and nondiscrimination.

Remembering the first responders—our public health officials, the workers stocking the shelves of our grocery stores, and all others who are ensuring that our basic needs are met are putting their lives on the line. The government must do its utmost to make sure that these human rights defenders are armed with necessary resources and protections, including economic security, to stem the outbreak and stay safe.

International human rights law offers us a blueprint for action, reminding us that all citizens of the world have inalienable rights—no matter their race, gender, background, income level, or sexual orientation.

…We are all facing this unprecedented crisis together.

See also: https://humanrightsdefenders.blog/2020/03/27/covid-19-spread-leads-to-reactions-and-messages-of-solidarity/

https://rfkhumanrights.org/news/coronavirus-statement

Jigsaw designed software (“Outline”) for self-controlled VPNs

March 21, 2018

A virtual private network (VPN), that core privacy tool that encrypts your internet traffic and bounces it through a faraway server, has always presented a paradox: Sure, it helps you hide from some forms of surveillance, like your internet service provider’s snooping and eavesdroppers on your local network. But it leaves you vulnerable to a different, equally powerful spy: whoever controls the VPN server you’re routing all your traffic through.

To help solve that quagmire, Jigsaw, the Alphabet-owned Google sibling that serves as a human rights-focused tech incubator, will now offer VPN software that you can easily set up on your own server—or at least, one you set up yourself, and control in the cloud. And unlike older homebrew VPN code, Jigsaw says it’s focused on making the setup and hosting of that server simple enough that even small, less savvy organizations or even individual users can do it in minutes.

Jigsaw says that the free DIY proxy software, called Outline, aims to provide an alternative to, on the one hand, stronger anonymity tools like Tor that slow down web browsing by bouncing connections through multiple encrypted hops around the world and, on the other hand, commercial VPNs that can be expensive, and also put users’ private information and internet history at risk.

“The core of the product is that people can run their own VPN,” says Santiago Andrigo, the Jigsaw product manager who led Outline’s development. “You get the reassurance that no one else has your data, and you can rest easier in that knowledge.”
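
Outline is built on the Shadowsocks protocol, so from an application’s point of view, “running your own VPN” boils down to routing traffic through a proxy server that only you control. The sketch below is an illustration of that idea, not Outline’s own client code; it assumes (hypothetically) that a Shadowsocks-compatible client is already running locally, pointed at a server you set up yourself, and exposing a SOCKS5 proxy on 127.0.0.1:1080.

```python
# Route an HTTP request through a self-hosted proxy (illustrative sketch).
# Assumes a Shadowsocks-compatible client is running locally and listening
# as a SOCKS5 proxy on 127.0.0.1:1080. Requires: pip install "requests[socks]"
import requests

proxies = {
    # "socks5h" makes DNS lookups go through the proxy as well,
    # so your local network does not see which hostnames you resolve.
    "http": "socks5h://127.0.0.1:1080",
    "https": "socks5h://127.0.0.1:1080",
}

# The destination site sees the IP address of your own server,
# and your local ISP sees only encrypted traffic to that server.
response = requests.get("https://example.org", proxies=proxies, timeout=30)
print(response.status_code)
```

The trade-off the article describes still applies: whoever controls that server, in this case you, can see the traffic that exits it.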

…A Swedish NGO, Civil Rights Defenders, has been testing Outline since last fall with the group of sensitive internet users it works to protect, who include journalists, lawyers, human rights defenders and LGBT communities in 18 repressive regimes around the world…

https://www.wired.com/story/alphabet-outline-vpn-software/

https://www.androidauthority.com/outline-censorship-vpn-847999/

see also: https://humanrightsdefenders.blog/2017/01/10/security-without-borders-offers-free-security-help-to-human-rights-defenders/

About the CLOUD Act and lists of ‘safe countries’

March 17, 2018
For the weekend, two long pieces (copied below in full) about a seemingly technical issue, but one that could have big consequences for human rights defenders. The key issue is that foreign governments who wanted to obtain information on a social media user from a US tech company (such as Microsoft, Google, Apple or Facebook) had to go through a cumbersome process using diplomatic channels (MLATs – mutual legal assistance treaties). The draft CLOUD Act (Clarifying Lawful Overseas Use of Data) proposes to make it easier for governments to get these data directly from the companies – and here is the tricky part – as long as these foreign governments are on a kind of ‘safe list’ with regard to human rights. And that is where the questions come in, according to the specialists below. And there are quite a few other worries.

Human Rights Groups Denounce Proposed Global Data Sharing

(CN) – With a pleasant-sounding name and acronym, the CLOUD Act stands for Clarifying Lawful Overseas Use of Data, but human rights groups take a far less sunny view of the bill than the tech giants pushing for its passage through Congress.

With the bill possibly heading to Capitol Hill next week, Microsoft, Google, Apple and Facebook have lined up behind the legislation, which overhauls how tech companies share data with foreign governments without notification or oversight.

Amnesty International’s U.S. director Naureen Shah depicted the legislation as a dystopic threat to human rights and press freedom globally while explaining her “grave misgivings” with the bill.

“The CLOUD Act jeopardizes the lives and safety of thousands of human rights defenders around the world at a time when they face unprecedented threats, intimidation and persecution, as we have documented in recent years,” Shah told reporters at a press conference on Thursday.

The CLOUD Act’s proponents and critics agree that the bill arose from the need to plug a gap in domestic and international law.

For decades, foreign governments requesting information from a U.S. company would have to work through diplomatic procedures known as MLATs, short for mutual legal assistance treaties.

“This process – from a privacy and human rights standpoint – is fairly rights-respecting,” the American Civil Liberties Union’s counsel Neema Singh Guliani said at a press conference from Washington.

For U.S. and foreign prosecutors, the MLAT process is cumbersome and gives the targets of criminal investigations cover to hide incriminating data in servers abroad.

This controversy came to a head in 2013, when New York federal prosecutors sought to circumvent the process to obtain emails of a target of a drug-trafficking investigation held on Microsoft’s servers in Dublin, Ireland. Microsoft went to court to protect the privacy of its users, waging a protracted legal battle currently pending before the U.S. Supreme Court.

Perhaps unwilling to gamble on Supreme Court victory, Microsoft and other companies have backed the CLOUD Act as an alternative.

“One of the things the bill would do is that it would moot the Microsoft Ireland case,” the ACLU’s Guliani noted.

For rights groups, however, Congress’ solution would be worse than the problem. The CLOUD Act lets countries that pass an unspecified human rights vetting bypass case-by-case U.S. government review and work directly with tech companies on information requests.

“We’re essentially relying on tech companies to be a kind of failsafe,” Shah told reporters.

Once a foreign government is safe-listed, Shah said, that nation can freely request information held by tech companies without congressional oversight for any particular request for five years.

That remains true even if a foreign government’s human rights record undergoes a dramatic decline during those years, as happened in Turkey over the last half decade.

“That’s a problem because we see governments around the world in a human rights freefall,” Shah noted.

Amnesty International has unique insight into that danger: the Turkish government jailed the organization’s Turkey chair, Taner Kilic, in the ongoing crackdown on journalists, human rights workers and other critical voices that the country has pursued in the wake of the coup attempt against President Recep Tayyip Erdogan.

“If you had looked at Turkey in 2012 or 2013, and matched it against the criteria in this bill, Turkey might have passed muster,” Shah said. “Of course, we know that especially since the coup in mid-2016, Turkey has become the world’s largest jailer of journalists.”

“More than 50,000 people at this point in Turkey have been swept up in their crackdown, including the chair and the director of Amnesty International, who were held, one of whom remains in prison, both of whom are being charged with terrorism offenses,” she added.

Under the CLOUD Act, Shah said, Congress would not be able to intervene if a safe-listed nation followed Turkey’s path.

Should that system fail, it is unclear that either the target of a foreign government’s investigation or the U.S. government would even know it.

The CLOUD Act offers the promise of subjecting governments to compliance reviews, but Guliani, the ACLU’s counsel, called this measure meaningless without individualized notice to users or the federal government.

“How can there be real compliance reviews if the U.S. government isn’t getting notice of individual requests?” she asked.

Guliani added that the CLOUD Act would also enable other governments to circumvent Wiretap Act restrictions against real-time interception.

Opposition from civil society groups has kicked into high gear out of fears that the CLOUD Act may get attached to an omnibus budget bill heading next week to Congress.

Joining the ACLU and Amnesty International, a coalition of 22 other groups signed a letter to elected representatives last week stating: “We urge you to oppose the CLOUD Act, and efforts to attach it to other pieces of legislation.”

As the omnibus budget has not yet been released, it is unclear whether that fear will come to pass.

—–

The CLOUD Act Doesn’t Help Privacy and Human Rights: It Hurts Them

By Neema Singh Guliani, Naureen Shah

Friday, March 16, 2018

At a time when human rights activists, dissidents and journalists around the world face unprecedented attacks, we cannot afford to weaken our commitment to human rights. But the recently introduced CLOUD Act would do just that.

The bill purports to address complaints that current mechanisms for foreign governments to obtain data from U.S. technology companies are slow, requiring review by the Justice Department and a warrant issued by a U.S. judge pursuant to the mutual legal assistance (MLA) process. The solution it proposes, however, is a dangerous abdication of responsibility by the U.S. government and technology companies.

Writing on Lawfare, Peter Swire and Jennifer Daskal have penned a defense of the CLOUD Act, arguing that things don’t work well now, that they could get worse and that this is the best option on the table. But even if we accept Daskal and Swire’s dire view of the state of current affairs, their argument leaves a lot unexplained—such as why an alternative framework or improved version of the CLOUD Act is not tenable, why efforts to pass the bill without any public markups of the legislation or the opportunity for amendments are advisable, and why no major international human rights organizations support it. Two of the largest human rights organizations, Amnesty International and Human Rights Watch, oppose the bill, along with over twenty other privacy and civil liberties organizations. (Swire and Daskal do note that some of these groups participated in a working group on this issue, though they don’t describe the strenuous objections made during that process.)

Most importantly, however, Daskal and Swire do not address how this bill could fail human rights activists and people around the world.

The very premise of the current CLOUD Act—the idea that countries can effectively be safe-listed as human-rights compliant, such that their individual data requests need no further human rights vetting—is wrong. The CLOUD Act requires the executive branch to certify each of these foreign governments as having “robust substantive and procedural protections for privacy and civil liberties” written into their domestic law. But many of the factors that must be considered provide merely a formalistic and even naïve measure of a government’s behavior. Flip through Amnesty International or Human Rights Watch’s recent annual reports, and you can find a dizzying array of countries that have ratified major human rights treaties and reflect those obligations in their domestic laws but, in fact, have arrested, tortured and killed people in retaliation for their activism or due to their identity.

In the case of countries the executive branch certifies, the CLOUD Act would not require the U.S. government to scrutinize data requests by the foreign governments—indeed, the bill would not even require notifying the U.S. government or a user about a request. The only line of defense would be technology companies, which hypothetically could refuse a request and refer it to the MLA process, but which may not have the resources, expertise, or even financial incentive to deny a foreign government’s request. Likewise, the bill requires that countries permit “periodic” reviews for compliance with civil liberties and privacy protections, but does not specify what these reviews will entail. It also doesn’t require even a cursory individual review of all orders or explain how the U.S. government can effectively ensure compliance in a timely fashion without being aware of requests in real time. For this reason, the periodic U.S. government reviews contemplated in the bill are an insufficient substitute for case-by-case consideration.

Daskal and Swire point to other safeguards: judges or independent authorities in the foreign country would review their government’s requests for data, they argue. But what about when courts greenlight, rather than check, police and intelligence services to go after human rights activists? This is not a problem confined to a small set of countries. In 2016, Amnesty International recorded numerous countries in which human rights defenders were detained or arrested based solely on their work.

Similarly, the CLOUD Act would not prevent harm to human rights activists and minorities in cases where a country experiences a rapid deterioration in human rights. Under the CLOUD Act, once a foreign government gets an international agreement, it is safe-listed for five years—with no built-in mechanism to ensure that the U.S. government acts quickly when there is a rapid change in circumstances.

For example, in early 2014, Turkey may have met the CLOUD Act’s vague human rights criteria; Freedom House even gave it a three and a four on its index for political and civil rights. But since the attempted coup in mid-2016, the Turkish government has arrested tens of thousands of people—including journalists and activists such as the chair and director of Amnesty International’s Turkey section—many on bogus terrorism charges. According to one report: “Most of these accusations of terrorism are based solely on actions such as downloading data protection software, including the ByLock application, publishing opinions disagreeing with the Government’s anti-terrorism policies, organizing demonstrations, or providing legal representation for other activists.”

Under the CLOUD Act, neither Congress nor U.S. courts would be able to prompt a review or a temporary moratorium for a case like Turkey. Users, without notice, would have little practical ability to lodge complaints with the U.S. government or providers. Even if the U.S. government were to take action, the CLOUD Act fails to ensure a sufficiently quick response to protect activists and others whose safety could be threatened.

In such a situation, the only real fail-safe to prevent a technology company from inadvertently acceding to a harmful data request is the technology company itself. But would even a well-intentioned technology company, particularly a small one, have the expertise and resources to competently assess the risk that a foreign order may pose to a particular human rights activist? Would it know, as in the example above, when to view Turkey’s terrorism charges in a particular case as baseless? In many cases, companies would likely rely on the biased assessments by foreign courts and fulfill requests.

Daskal and Swire argue that without the CLOUD Act, foreign governments with poor privacy standards will turn to data localization, which would pose greater human rights risks. But if the bill’s criteria are as strong as needed to protect privacy and human rights, those same foreign governments will not qualify for an international agreement—and so they may still push for data localization. The bill also does nothing to prevent a foreign government with an international agreement from pursuing data localization. If a technology company refused a government’s requests, the government could threaten to retaliate with localization and pressure the company to comply.

Finally, Swire and Daskal fail to address the CLOUD Act’s numerous ambiguities as to what human rights standards are a predicate to inclusion in the new data club the bill purports to create. Indeed, many of the criteria listed are merely factors that must be considered, not mandatory requirements. To highlight just a handful of the deficiencies in the bill:

  • The bill states that the Justice Department must consider whether a country respects free expression, without stating whether free expression is defined under U.S. law, international law, or a country’s own domestic law;
  • The bill states the Justice Department must consider whether a country respects “international universal human rights” without definition or clarity regarding how to assess this (indeed, this is not a recognized term in U.S. or international law);
  • The bill requires that requests be based on “articulable and credible facts, particularity, legality, and severity regarding the conduct under investigations”—a standard that is, at best, vague and subject to different interpretations, and is likely lower than the current probable cause standard applied to requests;
  • The bill fails to prohibit agreements in cases in which a country has a pattern or practice of engaging in human rights abuses, nor does it require an assessment as to whether there is effective central control of law enforcement or intelligence units;
  • The bill fails to require that countries meet any standards for metadata requests—leaving companies free to provide this data to human rights abusing countries without restriction;
  • For the first time, the bill allows foreign governments to wiretap and intercept communications in real-time, without even requiring governments to adhere to critical privacy protections in the Wiretap Act (such as notice, probable cause, or a set duration); and
  • The bill permits broad information sharing between governments, allowing countries (including the U.S.) to obtain information from foreign partners under standards that may be lower than their own domestic law.

These ambiguities provide the Justice Department with significant flexibility regarding the human rights standards a country must meet. What’s more, there’s no way for Congress or the judicial branch to act as a practical check in cases in which the executive branch makes the wrong decision. Country determinations are not subject to U.S. judicial review, and Congress would need to pass legislation within 90 days, likely with a veto-proof majority, to stop an agreement from going into effect—an extremely high hurdle that will be difficult to overcome.

In light of this, it’s far from clear that, as Daskal and Swire write, the bill “will raise privacy protections on a global scale.” If members of Congress and technology companies want to address concerns with the MLA process while protecting privacy and human rights, they should abandon the CLOUD Act and craft a rights-respecting solution. 

https://www.courthousenews.com/privacy-groups-denounce-proposed-global-data-sharing/

http://www.lawfareblog.com/cloud-act-doesnt-help-privacy-and-human-rights-it-hurts-them

see also related:

https://humanrightsdefenders.blog/2014/11/27/united-nations-declares-again-that-mass-surveillance-threatens-the-right-to-privacy/

https://humanrightsdefenders.blog/2014/12/02/ngos-concerned-about-alarming-proliferation-of-surveillance-technologies-to-repressive-countries-the-wassenaar-arrangement/

https://humanrightsdefenders.blog/2013/05/23/facebook-joins-the-global-network-initiative-for-human-rights/

BBC investigation on Arab States and import of cyber-surveillance tools

June 16, 2017

On 15 June 2017 the BBC came out with a special report, “How BAE sold cyber-surveillance tools to Arab states”.

A dancer tucks his Apple iPhone next to his traditional Omani dagger during a welcome ceremony in Muscat, Oman (5 November 2016).

A year-long investigation by BBC Arabic and a Danish newspaper [Dagbladet Information] has uncovered evidence that the UK defence giant BAE Systems has made large-scale sales across the Middle East of sophisticated surveillance technology, including to many repressive governments. These sales have also included decryption software which could be used against the UK and its allies. While the sales are legal, human rights campaigners and cyber-security experts have expressed serious concerns these powerful tools could be used to spy on millions of people and thwart any signs of dissent. The investigation began in the small Danish town of Norresundby, home to ETI, a company specialising in high-tech surveillance equipment. ETI developed a system called Evident, which enabled governments to conduct mass surveillance of their citizens’ communications. A former employee, speaking to the BBC anonymously, described how Evident worked. “You’d be able to intercept any internet traffic,” he said. “If you wanted to do a whole country, you could. You could pin-point people’s location based on cellular data. You could follow people around. They were quite far ahead with voice recognition. They were capable of decrypting stuff as well.”

 


A video clip accompanying the article is to be found on the website of the BBC (see link below) and it features Ahmed Mansoor, the 2015 Laureate of the Martin Ennals Award.[https://humanrightsdefenders.blog/2017/03/21/ahmed-mansoor-mea-laureate-2015-arrested-in-middle-of-the-night-raid-in-emirates/]

One early customer of the new system was the Tunisian government. The BBC tracked down a former Tunisian intelligence official who operated Evident for the country’s veteran leader, President Zine al-Abidine Ben Ali. “ETI installed it and engineers came for training sessions,” he explained. “[It] works with keywords. You put in an opponent’s name and you will see all the sites, blogs, social networks related to that user.” The source says President Ben Ali used the system to crack down on opponents until his overthrow in January 2011, in the first popular uprising of the Arab Spring. As protests spread across the Arab world, social media became a key tool for organisers. Governments began shopping around for more sophisticated cyber-surveillance systems – opening up a lucrative new market for companies like BAE Systems. In 2011, BAE bought ETI and the company became part of BAE Systems Applied Intelligence. Over the next five years, BAE used its Danish subsidiary to supply Evident systems to many Middle Eastern countries with questionable human rights records (such as Saudi Arabia, the UAE, Qatar, Oman, Morocco and Algeria).

 

“I wouldn’t be exaggerating if I said more than 90% of the most active campaigners in 2011 have now vanished,” says Yahya Assiri, a former Saudi air force officer who fled the country after posting pro-democracy statements online.  “It used to be that ‘the walls have ears’, but now it’s ‘smartphones have ears,‘” says Manal al-Sharif, a Saudi women’s rights activist who also now lives abroad. “No country monitors its own people the way they do in the Gulf countries. They have the money, so they can buy advanced surveillance software.” [see also: https://humanrightsdefenders.blog/2013/12/13/five-women-human-rights-defenders-from-the-middle-east/]

Manal al-Sharif says Gulf states have the money to buy advanced surveillance equipment.

‘Responsible trading’

…The BBC has obtained a 2015 email exchange between the British and Danish export authorities in which the British side clearly expresses concern about this capability with reference to an Evident sale to the United Arab Emirates. “We would refuse a licence to export this cryptanalysis software from the UK because of Criteria 5 concerns,” says the email. [“Criteria 5” refers to the national security of the UK and its allies.] Despite British objections, the Danish authorities approved the Evident export…

…Dutch MEP Marietje Schaake is one of the few European politicians prepared to discuss concerns about surveillance technology exports. She says European countries will ultimately pay a price for the compromises now being made. “Each and every case where someone is silenced or ends up in prison with the help of EU-made technologies I think is unacceptable,” she told the BBC. “I think the fact that these companies are commercial players, developing these highly sophisticated technologies that could have a deep impact on our national security, on people’s lives, requires us to look again at what kind of restrictions may be needed, what kind of transparency and accountability is needed in this market, before it turns against our own interest and our own principles.”

Source: How BAE sold cyber-surveillance tools to Arab states – BBC News

https://twitter.com/hashtag/freeahmed

Security Without Borders offers free security help to human rights defenders

January 10, 2017

Network World of 3 January 2017 carried an interesting piece on Claudio Guarnieri, who launched Security Without Borders, which offers free cybersecurity help to journalists, activists and human rights defenders.

For all the wonderful things the internet has given us, it has also been turned into a tool for repression. Nation states have deep pockets and use the imbalance to their own advantage. Technology has been used “to curb dissent, to censor information, to identify and monitor people.” Billions of dollars have been poured into surveillance, both passive and active. Sadly, electronic surveillance and censorship have become so commonplace that nowadays people can get arrested for a tweet. There are places where dissidents are hunted down, where using crypto is illegal, where sites are blocked and where even internet access can be cut off. “Those who face imprisonment and violence in the pursuit of justice and democracy cannot succeed if they don’t communicate securely as well as remain safe online.”

Security “is a precondition for privacy, which is the key enabler for freedom of expression.” He was not implying that the security should come from big firms, either, since big security businesses often need contracts with the government and are dependent on the national security sector. So, Guarnieri turned to the hacker community and launched Security Without Borders, which “is an open collective of hackers and cybersecurity professionals who volunteer with assisting journalists, human rights defenders, and non-profit organizations with cyber security issues.”


The website Security Without Borders has a big red button labeled “Request Assistance.” Activists, journalists and human rights defenders are encouraged to reach out for help. The group of “penetration testers, malware analysts, developers, engineers, system administrators and hackers” from all walks of life offers cybersecurity help: “We can assist with web security assessments, conduct breach investigations and analysis, and generally act as an advisor in questions pertaining to cybersecurity.” As security services are often expensive to come by, SWB offers these services free to organizations and people fighting against human rights abuse, racism, and other injustices.

When requesting help, you are asked to give your name or organization’s name, an email address, a description of the work you do and what kind of help you need. Hackers and computer security geeks who support freedom of speech are also encouraged to reach out and volunteer their skills.

There are still ongoing discussions on the mailing list on issues such as trust and where to draw the line in extending free help to specific groups. Security Without Borders is just getting off the ground and will have to deal with some of the same problems that earlier efforts in this area have faced; see e.g.: https://humanrightsdefenders.blog/2016/08/25/datnav-new-guide-to-navigate-and-integrate-digital-data-in-human-rights-research/ and https://humanrightsdefenders.blog/2016/10/31/protecting-human-rights-defenders-from-hackers-and-improving-digital-security/

Sources:

Security Without Borders: Free security help for dissidents | Network World

http://motherboard.vice.com/read/hacker-claudio-guarnieri-security-without-borders-political-dissidents

United Nations declares again that mass surveillance threatens the right to privacy

November 27, 2014

Several newspapers have reported on this matter, but perhaps not many in the US (see at the end). In an excellent blog post on 26 November 2014, Peter Micek and Javier Pallero give the background to this UN declaration, adopted for the second straight year, which states that government communications surveillance poses a threat to the right to privacy. I quote liberally from it:

Passed unanimously on Tuesday by the Third Committee, the resolution on “The right to privacy in the digital age” this time also calls for a permanent ‘office’ on the right to privacy. For that to happen, the Human Rights Council in Geneva will have to take action in March 2015 by creating a new “special rapporteur” on the right to privacy.

Background

In response to mass surveillance revelations in 2013, including news that their political leaders had been spied on, Brazil and Germany co-authored a unanimous resolution in the General Assembly. The resolution called for a report by the then High Commissioner for Human Rights, Navi Pillay, who delivered a scathing critique in July 2014, citing the need for immediate reform of surveillance laws and practices in line with international human rights norms. The report’s finding that mass surveillance inherently violates human rights spoke directly to the “five eyes” countries – the US, Canada, the UK, New Zealand, and Australia – who are responsible for weakening technical standards, collecting untold reams of data, and thwarting public debate over their practices.

Brazil and Germany again teamed up to lead this year’s effort, gathering more than 60 cosponsors. The resolution finds that surveillance must be “consistent with international human rights obligations and must be conducted on the basis of a legal framework, which must be publicly accessible, clear, precise, comprehensive and non-discriminatory.” It smartly calls for greater access to remedy for victims — a too-often ignored pillar of rights frameworks — and for increased attention to the role of private companies in government surveillance. In oral statements, the US and its allies in the “Five Eyes” drew attention to the resolution’s acknowledgment of “threats and harassment” that human rights defenders face along with privacy violations. And the resolution invites the Human Rights Council to “consider the possibility of establishing a special procedure” regarding the promotion and protection of the right to privacy.

Shortcomings

  • The resolution does not specifically call for governments to extend protections to users abroad.  Although expressing concern is important, governments must do much more to provide an effective solution to cross-border violations.
  • The resolution’s language on restrictions is unnecessarily general (“non-arbitrary and lawful”); it could have drawn on findings by multiple courts and international experts that more precisely define how privacy rights should be handled – that surveillance and other privacy restrictions should only be prescribed by law, necessary to achieve a legitimate aim, and proportionate to the aim pursued. These concepts are further articulated in the International Principles on the Application of Human Rights to Communications Surveillance, which High Commissioner Pillay said in her report can be considered interpretive guidance of Article 17 of the ICCPR, which establishes the right to privacy.
  • While the resolution notes that metadata can, when aggregated, “reveal personal information and can give an insight into an individual’s behaviour, social relationships, private preferences and identity,” it stops short of calling for an end to bulk metadata collection by governments, which the Human Rights Council has an opportunity to push for in March.

In summary, the authors of the post think that this resolution is a step in the right direction, and “Access” will continue working to ensure the Council follows through on the General Assembly’s suggestion and creates the Special Rapporteur.

https://www.accessnow.org/blog/2014/11/26/new-un-resolution-shifts-momentum-on-privacy-to-human-rights-council

In a related piece in ‘The Local’ one can read how Germany – at the heart of moves to limit the power of US web companies and their involvement in surveillance – is pressured by American companies and politicians.

 

http://www.thelocal.de/20141126/germany-denies-accusations-of-google-bashing

Internet guru Bruce Schneier will lecture on: Is it Possible to be Safe Online?

September 30, 2014

On 6 October 2014 Front Line Defenders will be hosting US computer privacy expert and “digital security guru” Bruce Schneier as the key-note speaker for their second Annual Lecture [for those in Ireland: at 6.30 pm in the Trinity Biomedical Science Institute – tickets are available at: https://bruceschneierdublin2014.eventbrite.ie].

This talk, entitled “Is it Possible to be Safe Online? Human Rights Defenders and the Internet”, will explore the issues faced by human rights defenders and everyday people on the ground as the use of computers and the Internet in their work becomes increasingly commonplace, while the threats posed by governments manipulating, monitoring and subverting electronic information, increased surveillance and censorship, and the lack of security for digitally communicated and stored information are all on the rise. Called a “security guru” by The Economist, Schneier has authored 12 books – including Liars and Outliers: Enabling the Trust Society Needs to Thrive – as well as hundreds of articles, essays and academic papers. His influential newsletter Crypto-Gram and his blog Schneier on Security are read by over 250,000 people worldwide.

via Is it Possible to be Safe Online? Human Rights Defenders & the Internet – lecture by Bruce Schneier – 06/10.