
Civil Society Groups Seek More Time to Review, Comment on Rushed Global Treaty for Intrusive Cross-Border Police Powers

Electronic Frontier Foundation (EFF), European Digital Rights (EDRi), and 40 other civil society organizations urged the Council of Europe’s Parliamentary Assembly and Committee of Ministers to allow more time to provide much-needed analysis and feedback on the flawed cross-border police surveillance treaty that the CoE’s cybercrime committee rushed to approve without adequate privacy safeguards.

Digital and human rights groups were largely sidelined and excluded during the drafting process of the Second Additional Protocol to the Budapest Convention, an international treaty that will establish global procedures for law enforcement in one country to access personal user data from technology companies in other countries. The CoE Cybercrime Committee (T-CY), which oversees the Budapest Convention, adopted internal rules in 2017 that fostered a narrower range of participants in the drafting of this new Protocol.

The process has been largely opaque, led by public safety and law enforcement officials, and T-CY’s periodic consultations with civil society and the public have been criticized for their lack of detail, their short response timelines, and the little insight they offer into countries’ deliberations on these issues. T-CY rushed approval of the text on May 28th, signing off on provisions that place few limitations on, and provide little oversight of, police access to sensitive user data held by Internet companies around the world.

The Protocol now heads to the Council of Europe Parliamentary Assembly (PACE) Committee on Legal Affairs and Human Rights, which can recommend further amendments. We hope the PACE will hear civil society’s privacy concerns and issue an opinion addressing the lack of adequate data protection safeguards. 

In a letter dated March 31st to PACE President Rik Daems and Chair of the Committee of Ministers Péter Szijjártó, digital and human rights groups said the treaty will likely be used extensively, with far-reaching implications for the security and privacy of people everywhere. It is imperative that the fundamental rights guaranteed in the European Convention on Human Rights and other agreements not be sidestepped, even if honoring them takes time, in favor of easier law enforcement access to user data free of judicial oversight and strong privacy protections. The CoE’s plan is to finalize the Protocol’s adoption by November and begin accepting signatures from countries sometime before 2022.

“We know that the Council of Europe has set high standards for its consultative process and has a strong commitment to stakeholder engagement,” EFF and its allies said in the letter. “Meaningful outreach is all the more important given the global reach of the draft Protocol, and the anticipated inclusion of many signatory parties who are not bound by the Council’s central human rights and data protection instruments.”

In 2018, EFF and 93 other civil society organizations from across the globe asked T-CY to invite civil society experts into the drafting plenary meetings, as is customary in other Council of Europe committee sessions. The goal was to hear Member States’ opinions and build on the richness of the discussion among States and experts, a discussion civil society missed because we were not invited to observe the drafting process. While EFF has participated in every public consultation of the T-CY process since our 2018 coalition letter, that level of participation has fallen short of meaningful multi-stakeholder principles of transparency, inclusion, and accountability. As Tamir Israel (CIPPIC) and Katitza Rodriguez (EFF) explained:

With limited incorporation of civil society input, it is perhaps no surprise that the final Protocol places law enforcement concerns first while human rights protections and privacy safeguards remain largely an afterthought. Instead of attempting to elevate global privacy protections, the Protocol’s central safeguards are left largely optional in an attempt to accommodate countries that lack adequate protections. As a result, the Protocol encourages global standards to harmonize at the lowest common denominator, weakening everyone’s right to privacy and free expression.

The full text of the letter:

Re: Ensuring Meaningful Consultation in Cybercrime Negotiations

We, the undersigned individuals and organizations, write to ask for a meaningful opportunity to give the final draft text of the proposed second additional protocol to Convention 185, the Budapest Cybercrime Convention, the full and detailed consideration which it deserves. We specifically ask that you provide external stakeholders further opportunity to comment on the significant changes introduced to the text on the eve of the final consultation round ending on 6th May, 2021.

The Second Additional Protocol aims to standardise cross-border access by law enforcement authorities to electronic personal data. While competing initiatives are also underway at the United Nations and the OECD, the draft Protocol has the potential to become the global standard for such cross-border access, not least because of the large number of states which have already ratified the principal Convention. In these circumstances, it is imperative that the Protocol should lay down adequate standards for the protection of fundamental rights.

Furthermore, the initiative comes at a time when even routine criminal investigations increasingly include cross-border investigative elements and, in consequence, the Protocol is likely to be used extensively. It therefore assumes great significance in setting international standards, with far-reaching implications for privacy and human rights around the world. It is important that its terms are carefully considered and ensure a proportionate balance between the objective of securing or recovering data for the purposes of law enforcement and the protection of fundamental rights guaranteed in the European Convention on Human Rights and in other relevant national and international instruments.

In light of the importance of this initiative, many of us have been following this process closely and have participated actively, including at the Octopus Conference in Strasbourg in November, 2019 and the most recent and final consultation round which ended on 6th May, 2021.

Although many of us were able to engage meaningfully with the text as it stood in past consultation rounds, it is significant that these earlier iterations of the text were incomplete and lacked provisions to protect the privacy of personal data. In the event, the complete text of the draft Protocol was not publicly available before 12th April, 2021. The complete draft text introduces a number of significant alterations, most notably the inclusion of Article 14, which added for the first time proposed minimum standards for privacy and data protection. While external stakeholders were previously notified that these provisions were under active consideration and would be published in due course, the publication of the revised draft on 12th April offered the first opportunity to examine these provisions and consider other elements of the Protocol in the full light of these promised protections.

We were particularly pleased to see the addition of Article 14, and welcome its important underlying intent—to balance law enforcement objectives with fundamental rights. However, the manner in which this is done is, of necessity, complex and intricate, and, even on a cursory preliminary examination, it is apparent that there are elements of the article which require careful and thoughtful scrutiny, in the light of which they might be capable of improvement.[1]

As a number of stakeholders have noted,[2] the latest (and final) consultation window was too short. It is essential that adequate time is afforded to allow a meaningful analysis of this provision and that all interested parties be given a proper chance to comment. We believe that such continued engagement can serve only to improve the text.

The newly introduced Article 14 is particularly detailed, and transformative in its impact on the entirety of the draft Protocol. Keeping in mind the multiple national systems potentially impacted by the draft Protocol, providing meaningful feedback on this long anticipated set of safeguards within the comment window has proven extremely difficult for civil society groups, data protection authorities and a wide range of other concerned experts.

Complicating our analysis further are gaps in the Explanatory Report accompanying the draft Protocol. We acknowledge that the Explanatory Report might continue to evolve, even after the Protocol itself is finalised, but the absence of elaboration on a pivotal provision such as Article 14 poses challenges to our understanding of its implications and our resulting ability meaningfully to engage in this important treaty process.

We know that the Council of Europe has set high standards for its consultative process and has a strong commitment to stakeholder engagement. Meaningful outreach is all the more important given the global reach of the draft Protocol, and the anticipated inclusion of many signatory parties who are not bound by the Council’s central human rights and data protection instruments. Misalignments between Article 14 and existing legal frameworks on data protection, such as Convention 108/108+, similarly demand careful scrutiny so that their implications are fully understood.

In these circumstances, we anticipate that the Council will wish to accord the highest priority to ensuring that fundamental rights are adequately safeguarded and that the consultation process is sufficiently robust to instill public confidence in the Protocol across the myriad jurisdictions which are to consider its adoption. The Council will, of course, appreciate that these objectives cannot be achieved without meaningful stakeholder input.

We are anxious to assist the Council in this process. In that regard, constructive stakeholder engagement requires a proper opportunity fully to assess the draft protocol in its entirety, including the many and extensive changes introduced in April 2021. We anticipate that the Council will share this concern, and to that end we respectfully suggest that the proposed text (inclusive of a completed explanatory report) be widely disseminated and that a minimum period of 45 days be set aside for interested stakeholders to submit comments.

We do realise that the T-CY Committee had hoped for an imminent conclusion to the drafting process. That said, adding a few months to a treaty process that has already spanned several years of internal drafting is both necessary and proportionate, particularly when the benefits of doing so will include improved public accountability and legitimacy, a more effective framework for balancing law enforcement objectives with fundamental rights, and a finalised text that reflects the considered input of civil society.

We very much look forward to continuing our engagement with the Council both on this and on future matters.

With best regards,

  1. Electronic Frontier Foundation (international)
  2. European Digital Rights (European Union)
  3. The Council of Bars and Law Societies of Europe (CCBE) (European Union)
  4. Access Now (International)
  5. ARTICLE19 (Global)
  6. ARTICLE19 Brazil and South America
  7. Association for Progressive Communications (APC)
  8. Association of Technology, Education, Development, Research and Communication – TEDIC (Paraguay)
  9. Asociación Colombiana de Usuarios de Internet (Colombia)
  10. Asociación por los Derechos Civiles (ADC) (Argentina)
  11. British Columbia Civil Liberties Association (Canada)
  12. Chaos Computer Club e.V. (Germany)
  13. Content Development & Intellectual Property (CODE-IP) Trust (Kenya)
  14. Dataskydd.net (Sweden)
  15. Derechos Digitales (Latinoamérica)
  16. Digitale Gesellschaft (Germany)
  17. Digital Rights Ireland (Ireland)
  18. Danilo Doneda, Director of Cedis/IDP and member of the National Council for Data Protection and Privacy (Brazil)
  19. Electronic Frontier Finland (Finland)
  20. epicenter.works (Austria)
  21. Fundación Acceso (Centroamérica)
  22. Fundacion Karisma (Colombia)
  23. Fundación Huaira (Ecuador)
  24. Fundación InternetBolivia.org (Bolivia)
  25. Hiperderecho (Peru)
  26. Homo Digitalis (Greece)
  27. Human Rights Watch (international)
  28. Instituto Panameño de Derecho y Nuevas Tecnologías – IPANDETEC (Central America)
  29. Instituto Beta: Internet e Democracia – IBIDEM (Brazil)
  30. Institute for Technology and Society – ITS Rio (Brazil)
  31. International Civil Liberties Monitoring Group (ICLMG)
  32. Iuridicium Remedium z.s. (Czech Republic)
  33. IT-Pol Denmark (Denmark)
  34. Douwe Korff, Emeritus Professor of International Law, London Metropolitan University
  35. Laboratório de Políticas Públicas e Internet – LAPIN (Brazil)
  36. Laura Schertel Mendes, Professor, Brasilia University and Director of Cedis/IDP (Brazil)
  37. Open Net Korea (Korea)
  38. OpenMedia (Canada)
  39. Privacy International (international)
  40. R3D: Red en Defensa de los Derechos Digitales (México)
  41. Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic – CIPPIC (Canada)
  42. Usuarios Digitales (Ecuador)
  43. org (Netherlands)
  44. Xnet (Spain)

[1] See, for example, Access Now, Comments on the draft 2nd Additional Protocol to the Budapest Convention on Cybercrime, available at: https://rm.coe.int/0900001680a25783; EDPB, Contribution to the 6th round of consultations on the draft Second Additional Protocol to the Council of Europe Budapest Convention on Cybercrime, available at: https://edpb.europa.eu/system/files/2021-05/edpb_contribution052021_6throundconsultations_budapestconvention_en.pdf.

[2] Alessandra Pierucci, Correspondence to Ms. Chloé Berthélémy, dated 17 May 2021; Consultative Committee of the Convention for the Protection of Individuals with Regard to Automated Processing of Personal Data, Directorate General Human Rights and Rule of Law, Opinion on Draft Second Additional Protocol, 7 May 2021, available at: https://rm.coe.int/opinion-of-the-committee-of-convention-108-on-the-draft-second-additio/1680a26489; EDPB, see footnote 1; Joint Civil Society Letter, 2 May 2021, available at: https://edri.org/wp-content/uploads/2021/05/20210420_LetterCoECyberCrimeProtocol_6thRound.pdf.


Federal Court Agrees: Prosecutors Can’t Keep Forensic Evidence Secret from Defendants

When the government tries to convict you of a crime, you have a right to challenge its evidence. This is a fundamental principle of due process, yet prosecutors and technology vendors have routinely argued against disclosing how forensic technology works.

For the first time, a federal court has ruled on the issue, and the decision marks a victory for civil liberties.

EFF teamed up with the ACLU of Pennsylvania to file an amicus brief arguing in favor of defendants’ rights to challenge complex DNA analysis software that implicates them in crimes. The prosecution and the technology vendor Cybergenetics opposed disclosure of the software’s source code on the grounds that the company has a commercial interest in secrecy.

The court correctly determined that this secrecy interest could not outweigh a defendant’s rights and ordered the code disclosed to the defense team. The disclosure will be subject to a “protective order” that bars further disclosure, but in a similar, earlier case a court eventually allowed public scrutiny of the source code of a different DNA analysis program after a defense team found serious flaws.

This is the second decision this year ordering the disclosure of the secret TrueAllele software. This added scrutiny will help ensure that the software does not contribute to unjust incarceration.


Turkey’s Free Speech Clampdown Hits Twitter, Clubhouse — But Most of All, The Turkish People

EFF has been tracking the Turkish government’s crackdown on tech platforms and its continuing efforts to force them to comply with draconian rules on content control and access to users’ data. So far, the Turkish government has managed to coerce Facebook, YouTube, and TikTok into appointing a legal representative, as the legislation requires, by threatening their bottom line: Turkish taxpayers are prohibited from placing ads with, and making payments to, any platform that fails to appoint one. According to local news, Google is the latest company to have appointed a representative, through a shell company in Turkey.

Out of the major foreign social media platforms used in Turkey, only Twitter has neither appointed a local representative nor subjected itself to Turkish jurisdiction over its content and user policies. Coincidentally, Twitter has been drawn into a series of moderation decisions that put the company in direct conflict with Turkish politicians. On February 2nd, Twitter decided that three tweets by the Turkish Interior Minister Süleyman Soylu violated its rules on hateful conduct and abusive behavior. Access to those tweets was restricted, rather than the tweets being removed, as Twitter considered them still in the public interest. Similarly, Twitter removed a tweet by Devlet Bahçeli, leader of the AKP’s coalition partner MHP, in which he called student protestors “terrorists” and “poisonous snakes” “whose heads needed to be crushed”, as the tweet violated Twitter’s violent threats policy.

Yaman Akdeniz, a founder of the Turkish Freedom of Expression Association, told EFF:

“This is the first time Twitter deployed its policy on Turkish politicians while the company is yet to decide whether to have a legal representative in Turkey as required by the Internet Social Media Law since October 2020.”

As in many other countries, politicians in Turkey are now angry at Twitter both for failing to sufficiently censor criticism of Turkish policies, and for sanctioning senior domestic political figures for their violations of the platform’s terms of service. 

By declining to appoint a local representative in an attempt to avoid both forms of political pressure, Twitter is already paying a price. The Turkish regulator BTK has imposed the first set of sanctions, forbidding Turkish taxpayers from paying for ads on Twitter. In principle, BTK can go further later this spring: starting in April 2021, it will be permitted to apply for further sanctions against Twitter, including ordering ISPs to throttle the speed of Turkish users’ connections to the site, at first by 50% and subsequently by up to 90%. Throttling can make sites practically inaccessible, fortifying Turkey’s censorship machine and silencing speech, a disproportionate measure that profoundly limits users’ ability to access online content within Turkey.

The Turkish Constitutional Court has overturned previous complete bans on Wikipedia in 2019 and on Twitter and YouTube back in 2014. Even though the recent legislation “only” foresees throttling sites’ access speeds by 50% or 90%, this sanction aims to make sites unusable in practice and should be viewed by the Court the same way as an outright ban. Usability research has long found that huge numbers of users lose patience with sites even slightly slower than they expect; delays of just one second are enough to interrupt a person’s conscious thought process, and making users wait five or ten times as long would be catastrophic.
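To put those percentages in concrete terms, here is a minimal back-of-the-envelope sketch in Python. The page size and baseline bandwidth are hypothetical round numbers chosen for illustration, not measurements of any real connection:

    # Rough illustration of what bandwidth throttling does to page load times.
    # PAGE_SIZE_MB and BASELINE_MBPS are invented round numbers, not measurements.
    PAGE_SIZE_MB = 2.0     # assumed size of a media-heavy page, in megabytes
    BASELINE_MBPS = 16.0   # assumed unthrottled bandwidth, in megabits per second

    def load_time_seconds(throttle_fraction: float) -> float:
        """Seconds to fetch the page when bandwidth is cut by throttle_fraction."""
        effective_mbps = BASELINE_MBPS * (1.0 - throttle_fraction)
        return (PAGE_SIZE_MB * 8.0) / effective_mbps  # megabits / (megabits per second)

    for label, throttle in [("no throttling", 0.0), ("50% throttle", 0.5), ("90% throttle", 0.9)]:
        print(f"{label}: {load_time_seconds(throttle):.1f} s")
    # no throttling: 1.0 s
    # 50% throttle: 2.0 s
    # 90% throttle: 10.0 s

Under these assumptions, a page that loads in about a second at full speed takes ten seconds under a 90% throttle, far past the point at which usability research says people give up.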

But if the Turkish authorities think that throttling away major platforms that refuse to comply with their orders will end online political conversation, they may have another problem. The new Internet Social Media Law covers any social network provider that exceeds a “daily access” of one million. While the law is unclear as to what that figure means in practice, it wasn’t written with smaller alternatives in mind, like Clubhouse, the new invitation-only, iOS-only audio-chat social networking app. Inevitably, with Twitter throttled and other services suspected of being required to comply with Turkish government demands, that’s exactly where political conversations have shifted.

During the recent crackdown, Clubhouse has hosted Turkish groups every night until after midnight, where students, academics, journalists, and sometimes politicians join the conversations. For now, Turkish speech enforcement is falling back to other forms of intimidation. At least four students were recently taken into custody. Although the government said the arrests related to the students’ use of other social media platforms, the students believe that their Clubhouse activity was the only thing that distinguished them from thousands of others.

Clubhouse, like many other fledgling, general-purpose social media networks, has not accounted for its use as a platform by endangered voices. It has a loosely enforced real-names policy, one of the reasons the students could be targeted by law enforcement. And as the Stanford Internet Observatory discovered, its design potentially allowed government actors or other network spies to collect private data on its users en masse.

Ultimately, while it’s the major tech companies that face legal sanctions and service interruptions under Turkey’s Social Media Law, it’s ordinary Turkish citizens who are really paying the price: whether slower Internet service, navigating cowed social platforms, or physical arrest for simply speaking out online on platforms that cannot yet adequately protect them from their own government.


Section 1201’s Harm to Security Research Shown by Mixed Decision in Corellium Case

Under traditional copyright law, security research is a well-established fair use, meaning it does not infringe copyright. When it was passed in 1998, Section 1201 of the Digital Millennium Copyright Act upset the balance of copyright law. Since then, the balance has been further upset, as some courts have interpreted Section 1201 so broadly that it effectively eliminates fair use whenever you have to bypass an access control, like encryption, to make that fair use.

The District Court’s ruling in Apple v. Corellium makes this shift crystal-clear. Corellium is a company that enables security researchers to run smartphone software in a controlled, virtual environment, giving them greater insights into how the software functions and where it may be vulnerable. Apple sued the company, alleging that its interactions with Apple code infringed copyright and that it offered unlawful circumvention technology under Section 1201 of the Digital Millennium Copyright Act.

Corellium asked for “summary judgment” that it had not violated the law. Summary judgment is decided as a matter of law when the relevant facts are not in dispute. (Summary judgment is far less expensive and time-consuming for the parties and the courts, while having to go to trial can be prohibitive for individual researchers and small businesses.) Corellium won on fair use, but the court said that there were disputed facts that prevented it from ruling on the Section 1201 claims at this stage of the litigation. It also rejected Corellium’s argument that fair use is a defense to a claim under Section 1201.

Fair use is part of what makes copyright law consistent with both the First Amendment and the Constitution’s requirement that intellectual monopoly rights like copyright – if created at all – must promote the progress of “science and the useful arts.”

We’re disappointed that the District Court failed to uphold the traditional limitations on copyright law that protect speech, research, and innovation. Applying fair use to Section 1201 would reduce the harm it does to fundamental rights.

It’s also disappointing that the provisions of Section 1201 that were enacted to protect security testing are so much less protective than traditional fair use has been. If those provisions were doing their job, the 1201 claim would be thrown out on summary judgment just as readily as the infringement claim, saving defendants and the courts from unnecessary time and expense.

We’ll continue to litigate Section 1201 to protect security researchers and the many other technologists and creators who rely on fair use in order to share their knowledge and creativity.


No Secret Evidence in Our Courts

If you’re accused of a crime, you have a right to examine and challenge the evidence used against you. In an important victory, an appeals court in New Jersey agreed with EFF and the ACLU of NJ that a defendant is entitled to see the source code of software that’s used to generate evidence against them.

The case of New Jersey v. Pickett involves complex DNA analysis using TrueAllele software. The software analyzed a DNA sample obtained by swabbing a weapon, a sample that likely contained the DNA of multiple people. It then asserted that it was likely that the defendant, Corey Pickett, had contributed DNA to that sample, implicating him in the crime.

But when the defense team wanted to analyze how that software arrived at that conclusion, the prosecutors and the software vendor insisted that it was a secret. They argued that the defense team shouldn’t be allowed to look at how the software actually worked, because the vendor has a commercial interest in preventing competitors from knowing its trade secrets.

The court correctly ruled in favor of the defendant’s right to understand and challenge the software being used to implicate him. The code will not be publicly disclosed, but will be made available to the defense team. The defense needs this information about TrueAllele so that it can fairly participate in a procedural step known as a Frye hearing, used to ensure that a defendant’s rights are not undermined through the introduction of unreliable expert evidence.

In previous instances, defense experts have found fatal flaws in this kind of software. For instance, a complex DNA analysis program called “FST” was shown to have an undisclosed function in the code with the potential to tip the scales against a defendant. After the defense team found the issue, journalists at ProPublica persuaded the court to have the source code disclosed to the public.
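To see why access to the code matters, consider a deliberately simplified sketch of the kind of statistic these programs report. Probabilistic genotyping tools compute a likelihood ratio comparing how well the evidence fits the hypothesis that the defendant contributed DNA against the hypothesis that an unrelated person did. Every number in the Python sketch below is invented for illustration, and real systems are vastly more complex, but it shows how a single buried modeling choice can swing the reported figure:

    # Toy likelihood-ratio calculation for DNA mixture evidence at one locus.
    # All probabilities are invented; real probabilistic genotyping systems
    # (TrueAllele, FST, STRmix) model far more than this.

    def likelihood_ratio(p_evidence_if_defendant: float,
                         p_evidence_if_random: float) -> float:
        """LR > 1 favors the prosecution hypothesis; LR < 1 favors the defense."""
        return p_evidence_if_defendant / p_evidence_if_random

    # A model that accounts for allele "drop-out" (a true allele failing to
    # appear in a small or degraded sample) gives the alternative,
    # random-contributor hypothesis more weight...
    lr_dropout_modeled = likelihood_ratio(0.020, 0.008)   # LR = 2.5

    # ...while a model that silently ignores drop-out, the kind of undisclosed
    # behavior defense experts uncovered in FST, inflates the statistic.
    lr_dropout_ignored = likelihood_ratio(0.020, 0.002)   # LR = 10.0

    print(f"LR with drop-out modeled: {lr_dropout_modeled:.1f}")   # 2.5
    print(f"LR with drop-out ignored: {lr_dropout_ignored:.1f}")   # 10.0

A jury only ever hears the final number; whether that number is 2.5 or 10 depends entirely on modeling choices buried in the source code, which is exactly why defense teams need to examine it.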

This issue has arisen all around the country, and we have filed multiple briefs in different courts warning of the danger of secret software being used to convict criminal defendants. No one should be imprisoned or executed based on secret evidence that cannot be fairly evaluated for its reliability, and the ruling in this case will help prevent that injustice.


Broad Coalition Urges Court Not to Block California’s Net Neutrality Law

After the federal government rolled back net neutrality protections for consumers in 2017, California stepped up and passed a bill that does what the FCC wouldn’t: bar telecoms from blocking or throttling Internet content and from imposing paid prioritization schemes. The law, SB 822, ensures that all Californians have full access to all Internet content and services, at lower prices.

Partnering with the ACLU of Northern California and numerous other public interest advocates, businesses and educators, EFF filed an amicus brief today urging a federal court to reject the telecom industry’s attempt to block enforcement of SB 822. The industry is claiming that California’s law is preempted by federal law—despite a court ruling that said the FCC can’t impose nationwide preemption of state laws protecting net neutrality.

Without legal protections, low-income Californians who rely on mobile devices for Internet access and can’t pay for more expensive content are at a real disadvantage. Their ISPs could inhibit full access to the Internet, which is critical for distance learning, maintaining small businesses, and staying connected. Schools and libraries are justifiably concerned that without net neutrality protections, paid prioritization schemes will degrade access to material that students and the public need in order to learn. SB 822 addresses that by ensuring that large ISPs do not take advantage of their stranglehold on Californians’ Internet access to slow or otherwise manipulate Internet traffic.

The large ISPs also have a vested interest in shaping Internet use to favor their own subsidiaries and business partners, at the expense of diverse voices and competition. Absent meaningful competition, ISPs have every incentive to leverage their last-mile monopolies to customers’ homes and bypass competition for a range of online services. That would mean less choice, lower quality, and higher prices for Internet users—and new barriers to entry for innovators. SB 822 aims to keep the playing field level for everyone.

These protections are important all of the time, but doubly so in crises like the ones California now faces: a pandemic, the resulting economic downturn, and a state wildfire emergency. And Internet providers have shown that they are not above using emergencies to exploit their gatekeeper power for financial gain. Just two years ago, when massive fires threatened the lives of rescuers, emergency workers, and residents, the Santa Clara fire department found that its “unlimited” data plan was being throttled by Verizon. Internet access on a vehicle the department was using to coordinate its fire response slowed to a crawl. When contacted, the company told firefighters that they needed to pay more for a better plan.

Without SB 822, Californians – and not just first responders – could find themselves in the same situation as the Santa Clara Fire Department: unable, thanks to throttling or other restrictions, to access information they need or connect with others. We hope the court recognizes how important SB 822 is and why the telecom lobby shouldn’t be allowed to block its enforcement.


Lawsuit in India Seeks to Shut Down Access to U.S. Journalism Website

Computer security researchers and journalists play a critical role in uncovering flaws in software and information systems. Their research and reporting allows users to protect themselves, and vendors to repair their products before attackers can exploit security flaws. But all too often, corporations and governments try to silence reporters, and punish the people who expose these flaws to the public.

This dynamic is playing out right now in a court in India, where a company is seeking to block Indian readers from accessing journalism by the American security journalist known as Dissent Doe. If it succeeds, more than a billion people in India will be blocked from reading Dissent Doe’s reporting.

Here’s what happened: last summer, Dissent Doe discovered that an employee wellness company was leaking patients’ private counseling information on the publicly available Web. Dissent alerted the company, called 1to1Help, so that it could secure its patients’ records. After Dissent repeatedly contacted the company, it finally secured the confidential data, a month after Dissent first notified it of the breach.

At that point, once the leak was fixed and the data was no longer available to malicious actors, Dissent wrote about the breach on the website DataBreaches.net, where Dissent reports on significant security flaws.

At first, 1to1Help seems to have recognized the strong public interest in having these types of vulnerabilities exposed. After fixing the breach, the company emailed Dissent to express its thanks for alerting the company, and allowing it to strengthen its data security.

A few weeks later, however, the company took a different tack. It filed a meritless criminal complaint against Dissent in the Bangalore City Civil Court alleging that Dissent “hacked” its patient files, even though the complaint itself acknowledges that the patient files were available to anyone on the public Web until Dissent alerted the company about this flaw. The criminal complaint also alleges that Dissent’s emails requesting comment for the DataBreaches.net story were “blackmail.”

Thankfully, any judgment against Dissent Doe in India would be unenforceable in the United States thanks to the protections of an important law called the Securing the Protection of Our Enduring and Established Constitutional Heritage (SPEECH) Act. Under the SPEECH Act, foreign orders aren’t enforceable in the United States unless they are consistent with the free speech protections that the U.S. and state constitutions guarantee, as well as with state laws.

But the injunction that 1to1Help is asking for would prevent Dissent’s website, DataBreaches.net, from being accessed by anyone in India. And if 1to1Help’s meritless lawsuit succeeds, other companies would surely follow suit in order to block Indians’ access to journalism online.

We hope the court in India decides to adhere to global principles of freedom of speech, and of the press. It should throw this dangerous lawsuit out of court.
