DSA Human Rights Alliance Publishes Principles Calling for DSA Enforcement to Incorporate Global Perspectives

28 January 2026 at 02:16

Since its founding by EFF and Access Now in 2021, the Digital Services Act (DSA) Human Rights Alliance has worked to ensure that the European Union follows a human rights-based approach to platform governance, both by integrating a wide range of voices and perspectives to contextualise DSA enforcement and by examining the DSA’s effect on tech regulations around the world.

As the DSA moves from legislation to enforcement, it has become increasingly clear that its impact depends not only on the text of the Act but also on how it is interpreted and enforced in practice. This is why the Alliance has created a set of recommendations to include civil society organizations and rights-defending stakeholders in the enforcement process.

The Principles for a Human Rights-Centred Application of the DSA: A Global Perspective, a report published this week by the Alliance, outlines steps that the European Commission, as the main DSA enforcer, along with national policymakers and regulators, should take to bring diverse groups to the table as a means of ensuring that the implementation of the DSA is grounded in human rights standards.

The Principles also offer guidance for regulators outside the EU who look to the DSA as a reference framework, as well as for international bodies and global actors concerned with digital governance and the wider implications of the DSA. The Principles promote meaningful stakeholder engagement and emphasize the role of civil society organisations in providing expertise and acting as human rights watchdogs.

“Regulators and enforcers need input from civil society, researchers, and affected communities to understand the global dynamics of platform governance,” said EFF International Policy Director Christoph Schmon. “Non-EU-based civil society groups should be enabled to engage on equal footing with EU stakeholders on rights-focused elements of the DSA. This kind of robust engagement will help ensure that DSA enforcement serves the public interest and strengthens fundamental rights for everyone, especially marginalized and vulnerable groups.”

“As activists are increasingly intimidated, journalists silenced, and science and academic freedom attacked by those who claim to defend free speech, it is of utmost importance that the Digital Services Act's enforcement is centered around the protection of fundamental rights, including the right to the freedom of expression,” said Marcel Kolaja, Policy & Advocacy Director—Europe at Access Now. “To do so effectively, the global perspective needs to be taken into account. The DSA Human Rights Principles provide this perspective and offer valuable guidance for the European Commission, policymakers, and regulators for implementation and enforcement of policies aiming at the protection of fundamental rights.”

“The Principles come at the crucial moment for the EU candidate countries, such as Serbia, that have been aligning their legislation with the EU acquis but still struggle with some of the basic rule of law and human rights standards,” said Ana Toskic Cvetinovic, Executive Director for Partners Serbia. “The DSA HR Alliance offers the opportunity for non-EU civil society to learn about the existing challenges of DSA implementation and design strategies for impacting national policy development in order to minimize any negative impact on human rights.”

 The Principles call for:

◼ Empowering EU and non-EU Civil Society and Users to Pursue DSA Enforcement Actions

◼ Considering Extraterritorial and Cross-Border Effects of DSA Enforcement

◼ Promoting Cross-Regional Collaboration Among CSOs on Global Regulatory Issues

◼ Establishing Institutionalised Dialogue Between EU and Non-EU Stakeholders

◼ Upholding the Rule of Law and Fundamental Rights in DSA Enforcement, Free from Political Influence

◼ Considering Global Experiences with Trusted Flaggers and Avoiding Enforcement Abuse

◼ Recognising the International Relevance of DSA Data Access and Transparency Provisions for Human Rights Monitoring

The Principles have been signed by 30 civil society organizations, researchers, and independent experts.

The DSA Human Rights Alliance represents diverse communities across the globe to ensure that the DSA embraces a human rights-centered approach to platform governance and that EU lawmakers consider the global impacts of European legislation.

 

EFF to California Appeals Court: First Amendment Protects Journalist from Tech Executive’s Meritless Lawsuit

16 January 2026 at 16:22

EFF asked a California appeals court to uphold a lower court’s decision to strike a tech CEO’s lawsuit against a journalist, a suit that sought to silence reporting that the CEO, Maury Blackman, didn’t like.

The journalist, Jack Poulson, reported on Maury Blackman’s arrest for felony domestic violence after receiving a copy of the arrest report from a confidential source. Blackman didn’t like that. So, he sued Poulson—along with Substack, Amazon Web Services, and Poulson’s non-profit, Tech Inquiry—to try and force Poulson to take his articles down from the internet.

Fortunately, the trial court saw this case for what it was: a classic SLAPP, or a strategic lawsuit against public participation. The court dismissed the entire complaint under California’s anti-SLAPP statute, which provides a way for defendants to swiftly defeat baseless claims designed to chill their free speech.

The appeals court should affirm the trial court’s correct decision.  

Poulson’s reporting is just the kind of activity that the state’s anti-SLAPP law was designed to protect: truthful speech about a matter of public interest. The felony domestic violence arrest of the CEO of a controversial surveillance company with U.S. military contracts is undoubtedly a matter of public interest. As we explained to the court, “the public has a clear interest in knowing about the people their government is doing business with.”

Blackman’s claims are totally meritless because they are barred by the First Amendment. The First Amendment protects Poulson’s right to publish and report on the incident report. Blackman argues that a court order sealing the arrest overrides Poulson’s right to report the news, despite decades of Supreme Court and California Court of Appeals precedent to the contrary. The trial court correctly rejected this argument and found that the First Amendment defeats all of Blackman’s claims. As the trial court explained, “the First Amendment’s protections for the publication of truthful speech concerning matters of public interest vitiate Blackman’s merits showing.”

The court of appeals should reach the same conclusion.

EU's New Digital Package Proposal Promises Red Tape Cuts but Guts GDPR Privacy Rights

4 December 2025 at 13:04

The European Commission (EC) is considering a “Digital Omnibus” package that would substantially rewrite EU privacy law, particularly the landmark General Data Protection Regulation (GDPR). It’s not a done deal, and it shouldn’t be.

The GDPR is the most comprehensive model for privacy legislation around the world. While the GDPR is far from perfect and suffers from uneven enforcement, complexity, and certain administrative burdens, the omnibus package is full of bad and confusing ideas that, on balance, will significantly weaken privacy protections for users in the name of cutting red tape.

It contains at least one good idea: improving consent rules so users can automatically set consent preferences that will apply across all sites. But much as we love limiting cookie fatigue, it’s not worth the price users will pay if the rest of the proposal is adopted. The EC needs to go back to the drawing board if it wants to achieve the goal of simplifying EU regulations without gutting user privacy.

Let’s break it down. 

Changing What Constitutes Personal Data

 The digital package is part of a larger Simplification Agenda to reduce compliance costs and administrative burdens for businesses, echoing the Draghi Report’s call to boost productivity and support innovation. Businesses have been complaining about GDPR red tape since its inception, and new rules are supposed to make compliance easier and turbocharge the development of AI in the EU. Simplification is framed as a precondition for firms to scale up in the EU, ironically targeting laws that were also argued to promote innovation in Europe. It might also stave off tariffs the U.S. has threatened to levy, thanks in part to heavy lobbying from Meta and tech lobbying groups.  

 The most striking proposal seeks to narrow the definition of personal data, the very basis of the GDPR. Today, information counts as personal data if someone can reasonably identify a person from it, whether directly or by combining it with other information.  

 The proposal jettisons this relatively simple test in favor of a variable one: whether data is “personal” depends on what a specific entity says it can reasonably do or is likely to do with it. This selectively restates part of a recent ruling by the EU Court of Justice but ignores the multiple other cases that have considered the issue. 

This structural move toward entity-specific standards will create massive legal and practical confusion, as the same data could be treated as personal for some actors but not for others. It also creates a path for companies to avoid established GDPR obligations via operational restructuring to separate identifiers from other information: a change in paperwork rather than in actual identifiability. What’s more, it will be up to the Commission, a political executive body, to define what counts as unidentifiable pseudonymized data for certain entities.

Privileging AI 

In the name of facilitating AI innovation, which often relies on large datasets in which sensitive data may residually appear, the digital package treats AI development as a “legitimate interest,” which gives AI companies a broad legal basis to process personal data, unless individuals actively object. The proposals gesture towards organisational and technical safeguards but leave companies broad discretion.  

Another amendment would create a new exemption that allows even sensitive personal data to be used for AI systems under some circumstances. This is not a blanket permission: “organisational and technical measures” must be taken to avoid collecting or processing such data, and proportionate efforts must be taken to remove them from AI models or training sets where they appear. However, it is unclear what will count as an appropriate or proportionate measure.

Taken together with the new personal data test, these AI privileges mean that core data protection rights, which are meant to apply uniformly, are likely to vary in practice depending on a company’s technological and commercial goals.  

And it means that AI systems may be allowed to process sensitive data even though non-AI systems that could pose equal or lower risks are not allowed to handle it.

A Broad Reform Beyond the GDPR

There are additional adjustments, many of them troubling, such as changes to the rules on automated decision-making (making it easier for companies to claim it’s needed for a service or contract), reduced transparency requirements (less explanation about how users’ data are used), and revised data access rights (supposed to tackle abusive requests). An extensive analysis by the NGO noyb can be found here.

Moreover, the digital package reaches well beyond the GDPR, aiming to streamline Europe’s digital regulatory rulebook, including the e-Privacy Directive, cybersecurity rules, the AI Act and the Data Act. The Commission also launched “reality checks” of other core legislation, which suggests it is eyeing other mandates.

Browser Signals and Cookie Fatigue

There is one proposal in the Digital Omnibus that actually could simplify something important to users: requiring online interfaces to respect automated consent signals, allowing users to automatically reject consent across all websites instead of clicking through cookie popups on each. Cookie popups are often designed with “dark patterns” that make rejecting data sharing harder than accepting it. Automated signals can address cookie banner fatigue and make it easier for people to exercise their privacy rights. 
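To make the mechanism concrete, here is a minimal sketch of how a site’s backend could honor an automated opt-out signal. The Digital Omnibus leaves the exact signal format to future technical standards; the example below uses Global Privacy Control (GPC), an existing signal of this kind that participating browsers send as a Sec-GPC: 1 request header. The Flask app and route name are hypothetical, illustrative only, and not anything specified in the proposal.

```python
# Minimal sketch: honoring an automated opt-out signal on the server side.
# Global Privacy Control (GPC) is one existing signal of this kind; browsers
# that enable it add a "Sec-GPC: 1" header to every request. The Flask app
# and the /article route are hypothetical, for illustration only.
from flask import Flask, jsonify, request

app = Flask(__name__)

def consent_refused() -> bool:
    # A Sec-GPC value of "1" means the user objects to their data being
    # shared or sold; treat it as a refusal of tracking consent.
    return request.headers.get("Sec-GPC") == "1"

@app.route("/article")
def article():
    if consent_refused():
        # Respect the signal: no consent banner, no third-party trackers.
        return jsonify(content="article body", tracking_enabled=False)
    # Otherwise fall back to whatever the user chose in the consent banner.
    return jsonify(content="article body", tracking_enabled=True)
```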

While this proposal is a step forward, the devil is in the details: First, the exact format of the automated consent signal will be determined by technical standards organizations where Big Tech companies have historically lobbied for standards that work in their favor. The amendments should therefore define minimum protections that cannot be weakened later. 

Second, the provision takes the important step of requiring web browsers to make it easy for users to send this automated consent signal, so they can opt out without installing a browser add-on.

However, mobile operating systems are excluded from this latter requirement, which is a significant oversight. People deserve the same privacy rights on websites and mobile apps. 

Finally, exempting media service providers altogether creates a loophole that lets them keep using tedious or deceptive banners to get consent for data sharing. A media service’s harvesting of user information on its website to track its customers is distinct from news gathering, which should be protected. 

A Muddled Legal Landscape

The Commission’s use of the "Omnibus" process is meant to streamline lawmaking by bundling multiple changes. An earlier proposal kept the GDPR intact, focusing on easing the record-keeping obligation for smaller businesses—a far less contentious measure. The new digital package instead moves forward with thinner evidence than a substantive structural reform would require, violating basic Better Regulation principles, such as coherence and proportionality.

The result is the opposite of “simple.” The proposed delay of the high-risk requirements under the AI Act to late 2027, part of the omnibus package, illustrates this: Businesses will face a muddled legal landscape as they must comply with rules that may soon be paused and later revived. This sounds like “complification” rather than simplification.

The Digital Package Is Not a Done Deal

Evaluating existing legislation is part of a sensible legislative cycle, and clarifying and simplifying complex processes and practices is not a bad idea. Unfortunately, the digital package misses the mark by making processes even more complex, at the expense of personal data protection.

Simplification doesn't require tossing out digital rights. The EC should keep that in mind as it launches its reality check of core legislation such as the Digital Services Act and Digital Markets Act, where tidying up can too easily drift into Verschlimmbesserung, the kind of well-meant fix that ends up resembling the infamous Ecce Homo restoration.
