
Application Gatekeeping: An Ever-Expanding Pathway to Internet Censorship

3 November 2025 at 15:57

It’s not news that Apple and Google use their app stores to shape what apps you can and cannot have on many of your devices. What is new is more governments—including the U.S. government—using legal and extralegal tools to lean on these gatekeepers in order to assert that same control. And rather than resisting, the gatekeepers are making it easier than ever. 

Apple’s decision to take down the ICEBlock app at least partially in response to threats from the U.S. government—with Google rapidly and voluntarily following suit—was bad enough. But it pales in comparison with Google’s new program, set to launch worldwide next year, requiring developers to register with the company in order to have their apps installable on Android-certified devices—including paying a fee and providing personal information backed by government-issued identification. Google claims the new program “is an extra layer of security that deters bad actors and makes it harder for them to spread harm,” but the registration requirements are barely tied to app effectiveness or security. Why, one wonders, does Google need to see your driver’s license to evaluate whether your app is safe? Why, one also wonders, does Google want to create a database of virtually every Android app developer in the world?

F-Droid, a free and open-source repository for Android apps, has been sounding the alarm. As they’ve explained in an open letter, Google’s central registration system will be devastating for the Android developer community. Many mobile apps are created, improved, and distributed by volunteers, researchers, and/or small teams with limited financial resources. Others are created by developers who do not use the name attached to any government-issued identification. Others may have good reason to fear handing over their personal information to Google, or any other third party. Those communities are likely to drop out of developing for Android altogether, depriving all Android users of valuable tools. 

Google’s promise that it’s “working on” a program for “students and hobbyists” that may have different requirements falls far short of what is necessary to alleviate these concerns. 

The point here is not that all the apps are necessarily perfect or even safe. The point is that when you set up a gate, you invite authorities to use it to block things they don’t like. And when you build a database, you invite governments (and private parties) to try to get access to that database. If you build it, they will come.  

Imagine you have developed a virtual private network (VPN) and corresponding Android mobile app that helps dissidents, journalists, and ordinary humans avoid corporate and government surveillance. In some countries, distributing that app could invite legal threats and even prosecution. Developers in those places should not have to trust that Google would refuse to hand over their personal information in response to a government demand just because they want their app to be installable by all Android users. By the same token, technologists who work on Android apps for reporting ICE misdeeds should not have to worry that Google will hand over their personal information to, say, the U.S. Department of Homeland Security.

Our tech infrastructure’s substantial dependence on just a few platforms is already creating new opportunities for those platforms to be weaponized to serve all kinds of disturbing purposes, from policing to censorship. In this context, it’s more important than ever to support technologies which decentralize and democratize our shared digital commons. A centralized global registration system for Android will inevitably chill this work. 

Not coincidentally, the registration system Google announced would also help cement Google’s outsized competitive power, giving the company an additional window—if it needed one, given the company’s already massive surveillance capabilities—into what apps are being developed, by whom, and how they are being distributed. It’s more than ironic that Google’s announcement came at the same time the company is fighting a court order (in the Epic Games v. Google lawsuit) that will require it to stop punishing developers who distribute their apps through app stores that compete with Google’s own. It’s easy to see how a new registration requirement for developers, potentially enforced by technical measures on billions of Android-certified mobile devices, could give Google a new lever for maintaining its app store monopoly.

EFF has signed on to F-Droid’s open letter. If you care about taking back control of tech, you should too. 

EFF to Court: The DMCA Didn't Create a New Right of Attribution, You Shouldn't Either

18 July 2025 at 16:37

Amid a wave of lawsuits targeting how AI companies use copyrighted works to train large language models that generate new works, a peculiar provision of copyright law is suddenly in the spotlight: Section 1202 of the Digital Millennium Copyright Act (DMCA). Section 1202 restricts intentionally removing or changing copyright management information (CMI), such as a signature on a painting or a name attached to a photograph. Passed in 1998, the rule was supposed to help rightsholders identify potentially infringing uses of their works and encourage licensing.

OpenAI and Microsoft used code from GitHub as part of the training data for their LLMs, along with billions of other works. A group of anonymous GitHub contributors sued, arguing that those LLMs generated new snippets of code that were substantially similar to theirs—but with the CMI stripped. Notably, they did not claim that the new code was copyright infringement—they are relying solely on Section 1202 of the DMCA. Their problem? The generated code is different from their original work, and courts across the U.S. have adopted an “identicality rule,” on the theory that Section 1202 is supposed to apply only when CMI is removed from existing works, not when it’s simply missing from a new one.

It may sound like an obscure legal question, but the outcome of this battle—currently before the Ninth Circuit Court of Appeals—could have far-reaching implications beyond generative AI technologies. If the rightsholders were correct, Section 1202 would effectively create a freestanding right of attribution, imposing potential liability even on non-infringing uses, such as fair use, that simply omit the CMI. While many fair users might ultimately escape liability under other limitations built into Section 1202, the looming threat of litigation, backed by the risk of high and unpredictable statutory penalties, would be enough to pressure many defendants to settle. Indeed, an entire legal industry of “copyright trolls” has emerged to exploit this dynamic, with no corresponding benefit to creativity or innovation.

Fortunately, as we explain in a brief filed today, the text of Section 1202 doesn’t support such an expansive interpretation. The provision repeatedly refers to “works” and “copies of works”—not “substantially similar” excerpts or new adaptations—and its focus on “removal or alteration” clearly contemplates actions taken with respect to existing works, not new ones. Congress could have chosen otherwise and written the law differently. Wisely, it did not, thereby ensuring that rightsholders couldn’t leverage the omission of CMI to punish or unfairly threaten otherwise lawful re-uses of a work.

Given the proliferation of copyrighted works in virtually every facet of daily life, the last thing any court should do is give rightsholders a new, freestanding weapon against fair uses. As the Supreme Court once observed, copyright is a “tax on readers for the purpose of giving a bounty to writers.” That tax—including the expense of litigation—can be an important way to encourage new creativity, but it should not be levied unless the Copyright Act clearly requires it.
