
Statutory Damages: The Fuel of Copyright-based Censorship

20 January 2026 at 16:15

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, addressing what's at stake and what we need to do to make sure that copyright promotes creativity and innovation.

Imagine every post online came with a bounty of up to $150,000 paid to anyone who finds it violates opaque government rules—all out of the pocket of the platform. Smaller sites could be snuffed out, and big platforms would avoid crippling liability by aggressively blocking, taking down, and penalizing speech that even possibly violates these rules. In turn, users would self-censor, and opportunists would turn accusations into a profitable business.

This dystopia isn’t a fantasy; it’s close to how U.S. copyright’s broken statutory damages regime actually works.

Copyright includes “statutory damages,” which let a jury decide how large a penalty the defendant will have to pay—anywhere from $200 to $150,000 per work—without necessarily seeing any evidence of actual financial losses or illicit profits. In fact, the law gives judges and juries almost no guidance on how to set damages. This is a huge problem for online speech.

One way or another, everyone builds on the speech of others when expressing themselves online: quoting posts, reposting memes, sharing images from the news. For some users, re-use is central to their online expression: parodists, journalists, researchers, and artists use others’ words, sounds, and images as part of making something new every day. Both these users and the online platforms they rely on risk unpredictable, potentially devastating penalties if a copyright holder objects to some re-use and a court disagrees with the user’s well-intentioned efforts.

On Copyright Week, we like to talk about ways to improve copyright law. One of the most important would be to fix U.S. copyright’s broken statutory damages regime. In other areas of civil law, the courts have limited jury-awarded punitive damages so that they can’t be far higher than the amount of harm caused. Extremely large jury awards for fraud, for example, have been found to offend the Constitution’s Due Process Clause. But somehow, that’s not the case in copyright—some courts have ruled that Congress can set damages that are potentially hundreds of times greater than actual harm.

Massive, unpredictable damages awards for copyright infringement, such as a $222,000 penalty for sharing 24 music tracks online, are the fuel that drives overzealous or downright abusive takedowns of creative material from online platforms. Capricious and error-prone copyright enforcement bots, like YouTube’s Content ID, were created in part to avoid the threat of massive statutory damages against the platform. Those same damages create an ever-present bias in favor of major rightsholders and against innocent users in the platforms’ enforcement decisions. And they stop platforms from addressing the serious problems of careless and downright abusive copyright takedowns.
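To make the per-work arithmetic concrete, here's a quick back-of-the-envelope sketch in Python, using the $222,000 award over 24 tracks mentioned above (the statutory-maximum comparison is ours, purely for illustration):

    # Statutory damages are assessed per work, so a per-work figure
    # anywhere in the $200-$150,000 range gets multiplied by the
    # number of works at issue.
    works = 24       # music tracks in the case mentioned above
    award = 222_000  # total jury award
    print(f"${award / works:,.0f} per track")  # $9,250 per track

    # At the statutory cap for willful infringement, the same 24
    # tracks could in principle cost far more:
    print(f"${150_000 * works:,} at the $150,000-per-work cap")  # $3,600,000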

By turning litigation into a game of financial Russian roulette, statutory damages also discourage artistic and technological experimentation at the boundaries of fair use. None but the largest corporations can risk ruinous damages if a well-intentioned fair use crosses the fuzzy line into infringement.

“But wait,” you might say, “don’t legal protections like fair use and the safe harbors of the Digital Millennium Copyright Act protect users and platforms?” They do—but the threat of statutory damages makes that protection brittle. Fair use allows for many important re-uses of copyrighted works without permission. But fair use is heavily dependent on circumstances and can sometimes be difficult to predict when copyright is applied to new uses. Even well-intentioned and well-resourced users avoid experimenting at the boundaries of fair use when the cost of a court disagreeing is so high and unpredictable.

Many reforms are possible. Congress could limit statutory damages to a multiple of actual harm. That would bring U.S. copyright in line with other countries, and with other civil laws like patent and antitrust. Congress could also make statutory damages unavailable in cases where the defendant has a good-faith claim of fair use, which would encourage creative experimentation. Fixing statutory damages would make many of the other problems in copyright law more easily solvable, and create a fairer system for creators and users alike.

Site Blocking Laws Will Always Be a Bad Idea: 2025 in Review

30 December 2025 at 16:46

This year, we fought back against the return of a terrible idea that hasn’t improved with age: site blocking laws. 

More than a decade ago, Congress tried to pass SOPA and PIPA—two sweeping bills that would have allowed the government and copyright holders to quickly shut down entire websites based on allegations of piracy. The backlash was massive. Internet users, free speech advocates, and tech companies flooded lawmakers with protests, culminating in an “Internet Blackout” on January 18, 2012. Turns out, Americans don’t like government-run internet blacklists. The bills were ultimately shelved.  

But we’ve never believed they were gone for good. The major media and entertainment companies that backed site blocking in the US in 2012 turned to pushing for site-blocking laws in other countries. Rightsholders continued to ask US courts for site-blocking orders, often winning them without a new law. And sure enough, the Motion Picture Association (MPA) and its allies have asked Congress to try again. 

There were no fewer than three Congressional drafts of site-blocking legislation. Representative Zoe Lofgren kicked off the year with the Foreign Anti-Digital Piracy Act (FADPA). Fellow House member Darrell Issa also claimed to be working on a bill that would make it offensively easy for a studio to block your access to a website based solely on a belief that infringement is happening. Not to be left out, the Senate Judiciary Committee produced the terribly named Block BEARD Act.

None of these three attempts to fundamentally alter the way you experience the internet moved far beyond their press releases. But their number tells us that there is, once again, an appetite among major media conglomerates and politicians to resurrect SOPA/PIPA.

None of these proposals fixes the flaws of SOPA/PIPA, and none ever could. Site blocking is a flawed idea and a disaster for free expression that no amount of rewriting will fix. There is no way to create a fast lane for removing your access to a website that is not a major threat to the open web. Just as we opposed SOPA/PIPA over ten years ago, we oppose these efforts.  

Site blocking bills seek to build a new infrastructure of censorship into the heart of the internet. They would enable court orders directed to the organizations that make the internet work, like internet service providers, domain name resolvers, and reverse proxy services, compelling them to help block US internet users from visiting websites accused of copyright infringement. The technical means haven’t changed much since 2012: they involve blocking the Internet Protocol addresses or domain names of websites. These methods are blunt—sledgehammers rather than scalpels. Today, many websites are hosted on cloud infrastructure or use shared IP addresses. Blocking one target can mean blocking thousands of unrelated sites. That kind of digital collateral damage has already happened in Austria, Italy, South Korea, France, and the US, to name just a few.
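To see the shared-address problem for yourself, here's a minimal sketch in Python (standard library only; the domains are placeholders you can swap for real ones) that groups domains by the IP address they resolve to, which is the exact granularity at which IP blocking operates:

    # Group domains by resolved IP address. If several unrelated sites
    # share one address, an IP-level block on any of them blocks all.
    import socket
    from collections import defaultdict

    domains = ["example.com", "example.net", "example.org"]  # placeholders

    sites_by_ip = defaultdict(list)
    for name in domains:
        try:
            sites_by_ip[socket.gethostbyname(name)].append(name)
        except socket.gaierror:
            pass  # name didn't resolve; skip it

    for ip, names in sites_by_ip.items():
        if len(names) > 1:
            print(f"{ip} is shared by: {', '.join(names)}")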

Given this downside, one would think the benefits of copyright enforcement from these bills ought to be significant. But site blocking is trivially easy to evade. Determined site owners can create the same content on a new domain within hours. Users who want to see blocked content can fire up a VPN or change a single DNS setting to get back online.  
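How trivial is the evasion? A sketch using the third-party dnspython library shows the whole trick: a client that queries a resolver other than its ISP's never sees a DNS-level block at all (the resolver address and domain here are illustrative):

    # DNS-based blocking relies on a resolver (often the ISP's default)
    # refusing to answer for a blocked name. Asking any other resolver
    # directly sidesteps it entirely.
    import dns.resolver  # third-party: pip install dnspython

    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = ["9.9.9.9"]  # any public resolver will do

    for rdata in resolver.resolve("example.com", "A"):
        print(rdata.address)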

The limits that lawmakers have proposed to put on these laws are an illusion. While ostensibly aimed at “foreign” websites, they sweep in any website that doesn’t conspicuously display a US origin, putting anonymity at risk. And despite the rhetoric of MPA and others that new laws would be used only by responsible companies against the largest criminal syndicates, laws don’t work that way. Massive new censorship powers invite abuse by opportunists large and small, and the costs to the economy, security, and free expression are widely borne. 

It’s time for Big Media and its friends in Congress to drop this flawed idea. But as long as they keep bringing it up, we’ll keep on rallying internet users of all stripes to fight it. 

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

Fair Use is a Right. Ignoring It Has Consequences.

18 December 2025 at 15:54

Fair use is not just an excuse to copy—it’s a pillar of online speech protection, and disregarding it in order to lash out at a critic should have serious consequences. That’s what we told a federal court in Channel 781 News v. Waltham Community Access Corporation, our case fighting copyright abuse on behalf of citizen journalists.

Waltham Community Access Corporation (WCAC), a public access cable station in Waltham, Massachusetts, records city council meetings on video. Channel 781 News (Channel 781), a group of volunteers who report on the city council, curates clips from those recordings for its YouTube channel, along with original programming, to spark debate on issues like housing and transportation. WCAC sent a series of takedown notices under the Digital Millennium Copyright Act (DMCA), accusing Channel 781 of copyright infringement. That led to YouTube deactivating Channel 781’s channel just days before a critical municipal election. Represented by EFF and the law firm Brown Rudnick LLP, Channel 781 sued WCAC for misrepresentations in its takedown notices under an important but underutilized provision of the DMCA.

The DMCA gives copyright holders a powerful tool to take down other people’s content from platforms like YouTube. The “notice and takedown” process requires only an email, or filling out a web form, in order to accuse another user of copyright infringement and have their content taken down. And multiple notices typically lead to the target’s account being suspended, because doing so helps the platform avoid liability. There’s no court or referee involved, so anyone can bring an accusation and get a nearly instantaneous takedown.

Of course, that power invites abuse. Because filing a DMCA infringement notice is so easy, there’s a temptation to use it at the drop of a hat to take down speech that someone doesn’t like. To prevent that, before sending a takedown notice, a copyright holder has to consider whether the use they’re complaining about is a fair use. Specifically, the copyright holder needs to form a “good faith belief” that the use is not “authorized by the law,” such as through fair use.

WCAC didn’t do that. They didn’t like Channel 781 posting short clips from city council meetings recorded by WCAC as a way of educating Waltham voters about their elected officials. So WCAC fired off DMCA takedown notices at many of Channel 781’s clips that were posted on YouTube.

WCAC claims they considered fair use, because a staff member watched a video about it and discussed it internally. But WCAC ignored three of the four fair use factors. WCAC ignored that their videos had no creativity, being nothing more than records of public meetings. They ignored that the clips were short, generally including one or two officials’ comments on a single issue. They ignored that the clips caused WCAC no monetary or other harm, beyond wounded pride. And they ignored facts they already knew, and that are central to the remaining fair use factor: by excerpting and posting the clips with new titles, Channel 781 was putting its own “spin” on the material; in other words, transforming it. All of these facts support fair use.

Instead, WCAC focused only on the fact that the clips they targeted were not altered further or put into a larger program. Looking at just that one aspect of fair use isn’t enough, and changing the fair use inquiry to reach the result they wanted is hardly the way to reach a “good faith belief.”

That’s why we’re asking the court to rule that WCAC’s conduct violated the law and that they should pay damages. Copyright holders need to use the powerful DMCA takedown process with care, and when they don’t, there need to be consequences.

Protecting Access to the Law—and Beneficial Uses of AI

30 September 2025 at 00:26

As the first copyright cases concerning AI reach appeals courts, EFF wants to protect important, beneficial uses of this technology—including AI for legal research. That’s why we weighed in on the long-running case of Thomson Reuters v. ROSS Intelligence. This case raises at least two important issues: the use of (possibly) copyrighted material to train a machine learning AI system, and public access to legal texts.  

ROSS Intelligence was a legal research startup that built an AI-based tool for locating judges’ written opinions based on natural language queries—a competitor to ubiquitous legal research platforms like Lexis and Thomson Reuters’ Westlaw. To build its tool, ROSS hired another firm to read through thousands of the “West headnotes” that Thomson Reuters adds to the legal decisions it publishes, paraphrasing the individual legal conclusions (what lawyers call “holdings”) that the headnotes identified. ROSS used those paraphrases to train its tool. Importantly, the ROSS tool didn’t output any West headnotes, or even the paraphrases of those headnotes; it simply directed the user to the original judges’ decisions. Still, Thomson sued ROSS for copyright infringement, arguing that using the headnotes without permission was illegal.
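To picture the kind of tool at issue, here's a toy sketch of natural-language retrieval over judicial opinions, built on scikit-learn's TF-IDF. This is a generic illustration of the task, not ROSS's actual (and unpublished) method, and the opinion snippets are invented placeholders:

    # Rank judicial opinions by similarity to a natural-language query,
    # returning a pointer to the best-matching opinion (not its text).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    opinions = [  # hypothetical stand-ins for full opinion texts
        "Statutory damages under the Copyright Act range from...",
        "Fair use is evaluated by weighing four statutory factors...",
        "Personal jurisdiction requires minimum contacts with the forum...",
    ]

    vectorizer = TfidfVectorizer(stop_words="english")
    opinion_vectors = vectorizer.fit_transform(opinions)

    query = "what factors govern fair use"
    scores = cosine_similarity(vectorizer.transform([query]), opinion_vectors)[0]

    best = scores.argmax()
    print(f"Best match (opinion #{best}, score {scores[best]:.2f})")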

Early decisions in the suit were encouraging. EFF wrote about how the court allowed ROSS to bring an antitrust counterclaim against Thomson Reuters, letting them try to prove that Thomson was abusing monopoly power. And the trial judge initially ruled that ROSS’s use of the West headnotes was fair use under copyright law. 

The case then took several turns for the worse. ROSS was unable to prove its antitrust claim. The trial judge issued a new opinion reversing his earlier decision and finding that ROSS’s use was not fair but rather infringed Thomson’s copyrights. And in the meantime, ROSS had gone out of business (though it continues to defend itself in court).

The court’s new decision on copyright was particularly worrisome. It ruled that West headnotes—a few lines of text copying or summarizing a single legal conclusion from a judge’s written opinion—could be copyrighted, and that using them to train the ROSS tool was not fair use, in part because ROSS was a competitor to Thomson Reuters. And the court rejected ROSS’s attempt to avoid any illegal copying by using a “clean room” procedure often used in software development. The decision also threatens to limit the public’s access to legal texts. 

EFF weighed in with an amicus brief joined by the American Library Association, the Association of Research Libraries, the Internet Archive, Public Knowledge, and Public.Resource.Org. We argued that West headnotes are not copyrightable in the first place, since they simply restate individual points from judges’ opinions with no meaningful creative contributions. And even if copyright does attach to the headnotes, we argued, the source material is entirely factual statements about what the law is, and West’s contribution was minimal, so fair use should have tipped in ROSS’s favor. The trial judge had found that the factual nature of the headnotes favored ROSS, but dismissed this factor as unimportant, effectively writing it out of the law. 

This case is one of the first to touch on copyright and AI, and is likely to influence many of the other cases that are already pending (with more being filed all the time). That’s why we’re trying to help the appeals court get this one right. The law should encourage the creation of AI tools to digest and identify facts for use by researchers, including facts about the law. 
