
A Wider View on TunnelVision and VPN Advice

29 May 2024 at 01:04

If you listen to any podcast long enough, you will almost certainly hear an advertisement for a Virtual Private Network (VPN). These advertisements usually assert that a VPN is the only tool you need to stop cyber criminals, malware, government surveillance, and online tracking. But these advertisements vastly oversell the benefits of VPNs. The reality is that VPNs are mainly useful for one thing: routing your network connection through a different network. Many people, including EFF, thought that VPNs were also a useful tool for encrypting your traffic when you don’t trust the network you’re on, such as at a coffee shop, university, or hacker conference. But new research from Leviathan Security serves as a reminder that this is not always the case and highlights the limited use cases for VPNs.

TunnelVision is a recently published attack method that can allow an attacker on a local network to force internet traffic to bypass your VPN and travel over an attacker-controlled channel instead. This allows the attacker to see any unencrypted traffic (such as what websites you are visiting). Traditionally, corporations deploy VPNs for employees to access private company sites from other networks. Today, many people use a VPN in situations where they don't trust their local network. But the TunnelVision exploit makes it clear that "I don't trust my local network" is not a threat model a VPN can always address: a VPN will not necessarily protect you when the local network itself is hostile.

TunnelVision exploits the Dynamic Host Configuration Protocol (DHCP) to reroute traffic outside of a VPN connection. The VPN connection itself stays up and does not appear broken, but an attacker can view any traffic that is not otherwise encrypted. Think of DHCP as giving you a nametag when you enter the room at a networking event. The host knows at least 50 guests will be in attendance and has allocated 50 blank nametags. Some nametags may be reserved for VIP guests, but the rest can be allocated to guests if you properly RSVP to the event. When you arrive, they check your name and then assign you a nametag. You may now properly enter the room and be identified as "Agent Smith." In the case of computers, this “name” is the IP address DHCP assigns to devices on the network. This is normally done by a DHCP server, but one could manually try it by way of clothespins in a server room.

TunnelVision abuses one of the configuration options in DHCP, called Option 121 (classless static routes), which lets a DHCP server push routes to a device it hands a lease to. An attacker running a rogue DHCP server on the local network can use it to install routes on a targeted device that send traffic outside the VPN tunnel and through attacker-controlled infrastructure instead. There have been attacks in the past, like TunnelCrack, that used similar methods, and chances are that if a VPN provider addressed TunnelCrack, they are working on verifying mitigations for TunnelVision as well.
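To make the route-pushing mechanics concrete, here is a minimal sketch, assuming nothing beyond RFC 3442 (which defines Option 121), of how a single classless static route is encoded: a destination prefix length, only the significant octets of the destination, then the next-hop router. The helper name and the example addresses are illustrative, not taken from the TunnelVision write-up.

    import ipaddress

    def encode_option_121_route(destination: str, router: str) -> bytes:
        """Encode one classless static route (RFC 3442) for DHCP Option 121:
        prefix length (1 byte), the significant octets of the destination
        network, then the 4-byte next-hop router address."""
        net = ipaddress.ip_network(destination)
        significant_octets = (net.prefixlen + 7) // 8
        return (
            bytes([net.prefixlen])
            + net.network_address.packed[:significant_octets]
            + ipaddress.ip_address(router).packed
        )

    # Hypothetical example: steer traffic for 198.51.100.0/24 through an
    # attacker-controlled gateway at 192.0.2.1 (both are documentation addresses).
    print(encode_option_121_route("198.51.100.0/24", "192.0.2.1").hex())  # 18c63364c0000201

A rogue DHCP server that includes routes like this in its responses can pull a victim's traffic onto a gateway it controls without touching the VPN process at all, which is why the VPN connection appears to stay healthy.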

In the words of the security researchers who published this attack method:

“There’s a big difference between protecting your data in transit and protecting against all LAN attacks. VPNs were not designed to mitigate LAN attacks on the physical network and to promise otherwise is dangerous.”

Rather than lament the many ways public, untrusted networks can leave you vulnerable, it's worth remembering how many protections are now provided by default. The internet was not originally built with security in mind, but many people have worked hard to rectify that, and today we have plenty of other tools in the toolbox to deal with these problems. For example, web traffic is mostly encrypted with HTTPS. This does not change your IP address the way a VPN can, but it does encrypt the contents of the web pages you visit and secure your connection to a website. The Domain Name System (DNS), which is queried before an HTTPS connection is even made, has also been a vector for surveillance and abuse, since the domain of the website you request is exposed at that level; there have been wide efforts to secure and encrypt it as well. Encrypted DNS and HTTPS by default are now available in every major browser, closing off possible attack vectors for snoops on the same network as you. Lastly, major browsers have implemented support for Encrypted Client Hello (ECH), which encrypts your initial connection to a website, sealing off metadata that was previously left in cleartext.
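As an aside on what "encrypted DNS" looks like in practice, here is a minimal sketch, not part of the original article, that resolves a hostname over DNS-over-HTTPS using Cloudflare's public JSON API (any resolver exposing the same documented API would work); the lookup travels inside an HTTPS connection instead of as plaintext DNS.

    import requests

    def resolve_over_doh(name: str, record_type: str = "A") -> list[str]:
        """Resolve a hostname via DNS-over-HTTPS so the lookup itself is
        encrypted in transit rather than sent as plaintext DNS."""
        resp = requests.get(
            "https://cloudflare-dns.com/dns-query",
            params={"name": name, "type": record_type},
            headers={"accept": "application/dns-json"},
            timeout=10,
        )
        resp.raise_for_status()
        return [answer["data"] for answer in resp.json().get("Answer", [])]

    print(resolve_over_doh("eff.org"))

Browsers that enable encrypted DNS by default do essentially this on your behalf, so a snoop on your local network sees only that you contacted a resolver, not which domains you asked about.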

TunnelVision is a reminder that we need to clarify what tools can and cannot do. A VPN does not provide anonymity online and neither can encrypted DNS or HTTPS (Tor can though). These are all separate tools that handle similar issues. Thankfully, HTTPS, encrypted DNS, and encrypted messengers are completely free and usable without a subscription service and can provide you basic protections on an untrusted network. VPNs—at least from providers who've worked to mitigate TunnelVision—remain useful for routing your network connection through a different network, but they should not be treated as a security multi-tool.

Four Infosec Tools for Resistance this International Women’s Day 

While online violence is alarmingly common globally, women are often more likely to be the target of mass online attacks, nonconsensual leaks of sensitive information and content, and other forms of online violence. 

This International Women’s Day, visit EFF’s Surveillance Self-Defense (SSD) to learn how to defend yourself and your friends from surveillance. In addition to tutorials for installing and using security-friendly software, SSD walks you through concepts like making a security plan, the importance of strong passwords, and protecting metadata.

1. Make Your Own Security Plan

This IWD, learn what a security plan looks like and how you can build one. Trying to protect your online data—like pictures, private messages, or documents—from everything all the time is impractical and exhausting. But, have no fear! Security is a process, and through thoughtful planning, you can put together a plan that’s best for you. Security isn’t just about the tools you use or the software you download. It begins with understanding the unique threats you face and how you can counter those threats. 

2. Protect Yourself on Social Networks

Depending on your circumstances, you may need to protect yourself against the social network itself, against other users of the site, or both. Social networks are among the most popular websites on the internet. Facebook, TikTok, and Instagram each have over a billion users. Social networks were generally built on the idea of sharing posts, photographs, and personal information. They have also become forums for organizing and speaking. Any of these activities can rely on privacy and pseudonymity. Visit our SSD guide to learn how to protect yourself.

3. Tips for Attending Protests

Keep yourself, your devices, and your community safe while you make your voice heard. Now, more than ever, people must be able to hold those in power accountable and inspire others through the act of protest. Protecting your electronic devices and digital assets before, during, and after a protest is vital to keeping yourself and your information safe, as well as getting your message out. Theft, damage, confiscation, or forced deletion of media can disrupt your ability to publish your experiences, and those engaging in protest may be subject to search or arrest, or have their movements and associations surveilled. 

4. Communicate Securely with Signal or WhatsApp

Everything you say in a chat app should be private, viewable by only you and the person you're talking with. But that's not how all chats or DMs work. Most of those communication tools aren't end-to-end encrypted, and that means that the company that runs the software could view your chats, or hand over transcripts to law enforcement. That's why it's best to use a chat app like Signal any time you can. Signal uses end-to-end encryption, which means that nobody, not even Signal, can see the contents of your chats. Of course, you can't necessarily force everyone you know to use the communication tool of your choice, but thankfully other popular tools, like Apple's Messages, WhatsApp, and more recently Facebook's Messenger, all use end-to-end encryption too, as long as you're communicating with others on those same platforms. The more people who use these tools, even for innocuous conversations, the better.

On International Women’s Day and every day, stay safe out there! Surveillance self-defense can help.

This blog is part of our International Women’s Day series. Read other articles about the fight for gender justice and equitable digital rights for all.

  1. Four Reasons to Protect the Internet this International Women’s Day
  2. Four Voices You Should Hear this International Women’s Day
  3. Four Actions You Can Take To Protect Digital Rights this International Women’s Day

Celebrating 15 Years of Surveillance Self-Defense

4 March 2024 at 13:59

On March 3rd, 2009, we launched Surveillance Self-Defense (SSD). At the time, we pitched it as, "an online how-to guide for protecting your private data against government spying." In the last decade hundreds of people have contributed to SSD, over 20 million people have read it, and the content has nearly doubled in length from 40,000 words to almost 80,000. SSD has served as inspiration for many other guides focused on keeping specific populations safe, and those guides have in turn affected how we've approached SSD. A lot has changed in the world over the last 15 years, and SSD has changed with it. 

The Year Is 2009

Let's take a minute to travel back in time to the initial announcement of SSD. Launched with the support of the Open Society Institute, and written entirely by just a few people, we detailed exactly what our intentions were with SSD at the start:

EFF created the Surveillance Self-Defense site to educate Americans about the law and technology of communications surveillance and computer searches and seizures, and to provide the information and tools necessary to keep their private data out of the government's hands… The Surveillance Self-Defense project offers citizens a legal and technical toolkit with tips on how to defend themselves in case the government attempts to search, seize, subpoena or spy on their most private data.

[Screenshot: SSD in 2009, with a red logo and a block of text]

SSD's design when it first launched in 2009.

To put this further into context, it's worth looking at where we were in 2009. Avatar was the top grossing movie of the year. Barack Obama was in his first term as president in the U.S. In a then-novel approach, Iranians turned to Twitter to organize protests. The NSA has a long history of spying on Americans, but we hadn't gotten to Jewel v. NSA or the Snowden revelations yet. And while the iPhone had been around for two years, it hadn't seen its first big privacy controversy yet (that would come in December of that year, but it'd be another year still before we hit the "your apps are watching you" stage).

Most importantly, in 2009 it was more complicated to keep your data secure than it is today. HTTPS wasn't common, using Tor required more technical know-how than it does nowadays, encrypted IMs were the fastest way to communicate securely, and full-disk encryption wasn't a common feature on smartphones. Even for computers, disk encryption required special software and knowledge to implement (not to mention time: solid-state drives were still extremely expensive in 2009, so most people still had spinning-disk hard drives, which took ages to encrypt and usually slowed down your computer significantly).

And thus, SSD in 2009 focused heavily on law enforcement and government access with its advice. Not long after the launch in 2009, in the midst of the Iranian uprising, we launched the international version, which focused on the concerns of individuals struggling to preserve their right to free expression in authoritarian regimes.

And that's where SSD stood, mostly as-is, for about six years. 

The Redesigns

In 2014, we redesigned and relaunched SSD with support from the Ford Foundation. The relaunch had at least 80 people involved in the writing, reviewing, design, and translation process. With the relaunch, there was also a shift in the mission as the threats expanded from just the government, to corporate and personal risks as well. From the press release:

"Everyone has something to protect, whether it's from the government or stalkers or data-miners," said EFF International Director Danny O'Brien. "Surveillance Self-Defense will help you think through your personal risk factors and concerns—is it an authoritarian government you need to worry about, or an ex-spouse, or your employer?—and guide you to appropriate tools and practices based on your specific situation."

[Screenshot: SSD in 2014, with a logo of two crossed keys and a block of text]

2014 proved to be a timely year for a major update. After the murders of Michael Brown and Eric Garner, protestors hit the streets across the U.S., which made our protest guide particularly useful. There were also major security vulnerabilities that year, like Heartbleed, which caused all sorts of security issues for website operators and their visitors, and Shellshock, which left everything from servers to cameras open to exploitation, ushering in what felt like an endless stream of software updates for everything with a computer chip in it. And of course, there was still fallout from the Snowden leaks in 2013.

In 2018 we did another redesign, and added a new logo for SSD that came along with EFF's new design. This is more or less the same design of the site today.

[Screenshot: SSD's current design, with an infinity logo wrapped around a lock and key]

SSD's current design, which further clarifies what sections a guide is in, and expands the security scenarios.

Perhaps the most notable difference between this iteration of SSD and the years before is the lack of detailed reasoning explaining the need for its existence on the front page. No longer was it necessary to explain why we all need to practice surveillance self-defense. Online surveillance had gone mainstream.

Shifting Language Over the Years

As the years passed and the site was redesigned, we also shifted how we talked about security. In 2009 we wrote about security with terms like "adversaries," "defensive technology," "threat models," and "assets." These were all common cybersecurity terms at the time, but they made security sound like a military exercise, which often alienated the very people who needed help. For example, in the later part of the 2010s, we reworked the idea of "threat modeling" when we published Your Security Plan. This was meant to be less intimidating and more inclusive of the various types of risks that people face.

The advice in SSD has changed over the years, too. Take passwords as an example, where in 2009 we said, "Although we recommend memorizing your passwords, we recognize you probably won't." First off, rude! Second off, maybe that could fly with the lower number of accounts we all had back in 2009, but nowadays nobody is going to remember hundreds of passwords. And regardless, that seems pretty dang impossible when paired with the final bit of advice, "You should change passwords every week, every month, or every year — it all depends on the threat, the risk, and the value of the asset, traded against usability and convenience."

Moving onto 2015, we phrased this same sentiment much differently, "Reusing passwords is an exceptionally bad security practice, because if an attacker gets hold of one password, she will often try using that password on various accounts belonging to the same person… Avoiding password reuse is a valuable security precaution, but you won't be able to remember all your passwords if each one is different. Fortunately, there are software tools to help with this—a password manager."

Well, that's much more polite!

Since then, we've toned that down even more: "Reusing passwords is a dangerous security practice. If someone gets ahold of your password—whether that's from a data breach, or wherever else—they can often gain access to any other account where you used that same password. The solution is to use unique passwords everywhere and take additional steps to secure your accounts when possible."

Security is an always-evolving process, and so is how we talk about it. But the more people we bring on board, the better it is for everyone. How we talk about surveillance self-defense will assuredly continue to adapt in the future.

Shifting Language(s) Over the Years

Initially, in 2009, SSD was only available in English, and soon after launch, in Bulgarian. In the 2014 re-launch we added Arabic and Spanish, and then French, Thai, Vietnamese, and Urdu in 2015. Later that year, we added a handful of Amharic translations, too. This was accomplished through a web of people in dozens of countries who volunteered to translate and review everything. Many of these translations were done for highly specific reasons. For example, we had a Google Policy Fellow, Endalk Chala, who was part of the Zone 9 bloggers in Ethiopia. He translated everything into Amharic as he was fighting for his colleagues and friends who were imprisoned in Ethiopia on terrorism charges.

By 2019, we were translating most of SSD into at least 10 languages: Amharic, Arabic, Spanish, French, Russian, Turkish, Vietnamese, Brazilian Portuguese, Thai, and Urdu (as well as additional, externally-hosted community translations in Indonesian Bahasa, Burmese, Traditional Chinese, Igbo, Khmer, Swahili, Yoruba, and Twi).

Currently, we're focusing on getting the entirety of SSD re-translated into seven languages, then focusing our efforts on translating specific guides into other languages. 

Always Updating

Since 2009, we've done our best to review and update the guides in SSD. This has included minor changes to respond to news events, deprecating guides completely when they're no longer applicable in modern security plans, and massive rewrites when technology has changed.

The original version of SSD was launched mostly as static text (we even offered a printer-friendly version); though updates and revisions did occur, they were not publicly tracked as clearly as they are today. In its early years, SSD was able to provide useful guidance across a number of important events, like Occupy Wall Street, before the major site redesign in 2014, which helped it become more useful for training activists, including those at Ferguson and Standing Rock, among others. The ability to update SSD along with changing trends and needs has ensured it can always be useful as a resource.

That redesign also better facilitated the updates process. The site became easier to navigate and use, and easier to update. For example, in 2017 we took on a round of guide audits in response to concerns following the 2016 election. In 2019 we continued that process with around seven major updates to SSD, and in 2020, we did five. We don't have great stats for 2021 and 2022, but in 2023 we managed 14 major updates or new guides. We're hoping to have the majority of SSD reviewed and revamped by the end of this year, with a handful of expansions along the way.

Which brings us to the future of SSD. We will continue updating, adapting, and adding to SSD in the coming years. It is often impossible to know what will be needed, but rest assured we'll be there to answer that whenever we can. As mentioned above, this includes getting more translations underway, and continuing to ensure that everything is accurate and up-to-date so SSD can remain one of the best repositories of security information available online.

We hope you’ll join EFF in celebrating 15 years of SSD!

Privacy Isn't Dead. Far From It.

13 February 2024 at 19:07

Welcome! 

The fact that you’re reading this means that you probably care deeply about the issue of privacy, which warms our hearts. Unfortunately, even though you care about privacy, or perhaps because you care so much about it, you may feel that there's not much you (or anyone) can really do to protect it, no matter how hard you try. Perhaps you think “privacy is dead.” 

We’ve all probably felt a little bit like you do at one time or another. At its worst, this feeling might be described as despair. Maybe it hits you because a new privacy law seems to be too little, too late. Or maybe you felt a kind of vertigo after reading a news story about a data breach or a company that was vacuuming up private data willy-nilly without consent. 

Even if you don’t have this feeling now, at some point you may have felt—or possibly will feel—that we’re past the point of no return when it comes to protecting our private lives from digital snooping. There are so many dangers out there—invasive governments, doorbell cameras, license plate readers, greedy data brokers, mismanaged companies that haven’t installed any security updates in a decade. The list goes on.

This feeling is sometimes called “privacy nihilism.” Those of us who care the most about privacy are probably more likely to get it, because we know how tough the fight is. 

We could go on about this feeling, because sometimes we at EFF have it, too. But the important thing to get across is that this feeling is valid, but it’s also not accurate. Here’s why.

You Aren’t Fighting for Privacy Alone

For starters, remember that none of us are fighting alone. EFF is one of dozens, if not hundreds,  of organizations that work to protect privacy.  EFF alone has over thirty-thousand dues-paying members who support that fight—not to mention hundreds of thousands of supporters subscribed to our email lists and social media feeds. Millions of people read EFF’s website each year, and tens of millions use the tools we’ve made, like Privacy Badger. Privacy is one of EFF’s biggest concerns, and as an organization we have grown by leaps and bounds over the last two decades because more and more people care. Some people say that Americans have given up on privacy. But if you look at actual facts—not just EFF membership, but survey results and votes cast on ballot initiatives—Americans overwhelmingly support new privacy protections. In general, the country has grown more concerned about how the government uses our data, and a large majority of people say that we need more data privacy protections. 

People are angry because they care about privacy, not because privacy is dead.

Some people also say that kids these days don’t care about their privacy, but the ones that we’ve met think about privacy a lot. What’s more, they are fighting as hard as anyone to stop privacy-invasive bills like the Kids Online Safety Act. In our experience, the next generation cares intensely about protecting privacy, and they’re likely to have even more tools to do so. 

Laws are Making Their Way Around the World

Strong privacy laws don’t cover every American—yet. But take a look at just one example to see how things are improving: the California Consumer Privacy Act of 2018 (CCPA). The CCPA isn’t perfect, but it did make a difference. The CCPA granted Californians a few basic rights when it comes to their relationship with businesses, like the right to know what information companies have about you, the right to delete that information, and the right to tell companies not to sell your information. 

This wasn’t a perfect law for a few reasons. Under the CCPA, consumers have to go company-by-company to opt out in order to protect their data. At EFF, we’d like to see privacy and protection as the default until consumers opt in. Also, the CCPA doesn’t allow individuals to sue if their data is mismanaged—only California’s Attorney General and the California Privacy Protection Agency can do so. And of course, the law only covers Californians.

But this imperfect law is slowly getting better. Just this year California’s legislature passed the DELETE Act, which resolves one of those issues. The California Privacy Protection Agency now must create a deletion mechanism for data brokers that allows people to make their requests to every data broker with a single, verifiable consumer request. 

Pick a privacy-related topic, and chances are good that model bills are being introduced, or already exist as laws in some places, even if they don’t exist everywhere. The Illinois Biometric Information Privacy Act, for example, passed back in 2008, protects people from nonconsensual use of their biometrics for face recognition. We may not have comprehensive privacy laws yet in the US, but other parts of the world—like Europe—have more impactful, if imperfect, laws. We can have a nationwide comprehensive consumer data privacy law, and once those laws are on the books, they can be improved.  

We Know We’re Playing the Long Game

Remember that it takes time to change the system. Today we take many protections for granted, and often assume that things are only getting worse, not better. But many important rights are relatively new. For example, our Constitution didn’t always require police to get a warrant before wiretapping our phones. It took the Supreme Court four decades to get this right. (They were wrong in 1928 in Olmstead, then right in 1967 in Katz.)

Similarly, creating privacy protections in law and in technology is not a sprint. It is a marathon. The fight is long, and we know that. Below, we’ve got examples of the progress that we’ve already made, in law and elsewhere. 

Just because we don’t have some protective laws today doesn’t mean we can’t have them tomorrow. 

Privacy Protections Have Actually Increased Over the Years

The World Wide Web is Now Encrypted 

When the World Wide Web was created, most websites were unencrypted. Privacy laws aren’t the only way to create privacy protections, as the now nearly-entirely encrypted web shows:  another approach is to engineer in strong privacy protections from the start. 

The web has now largely switched from non-secure HTTP to the more secure HTTPS protocol. Before this happened, most web browsing was vulnerable to eavesdropping and content hijacking. HTTPS fixes most of these problems. That's why EFF, and many like-minded supporters, pushed for websites to adopt HTTPS by default. As of 2021, about 90% of all web page visits use HTTPS. This switch happened in under a decade. This is a big win for encryption and security for everyone, and EFF's Certbot and HTTPS Everywhere are tools that helped make it happen: Certbot offers an easy, free way to switch an existing HTTP site to HTTPS, and HTTPS Everywhere steered browsers to the encrypted version of a site whenever one existed. (With a lot of help from Let's Encrypt, started in 2013 by a group of determined researchers and technologists from EFF and the University of Michigan.) Today, implementing HTTPS is the default.
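As a small illustration of what "HTTPS by default" means in practice (not something from the original post; the domain below is a placeholder), a site operator can check that plain-HTTP requests get redirected to HTTPS:

    import requests

    def redirects_to_https(domain: str) -> bool:
        """Return True if a plain-HTTP request to the domain ends up on an
        https:// URL after following redirects."""
        resp = requests.get(f"http://{domain}/", allow_redirects=True, timeout=10)
        return resp.url.startswith("https://")

    # example.org is a placeholder; substitute a site you operate.
    print(redirects_to_https("example.org"))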

Cell Phone Location Data Now Requires a Warrant

In 2018, the Supreme Court handed down a landmark opinion in Carpenter v. United States, ruling 5-4 that the Fourth Amendment protects cell phone location information. As a result, police must now get a warrant before obtaining this data. 

But where else this ruling applies is still being worked out. Perhaps the most significant part of the ruling is its explicit recognition that individuals can maintain an expectation of privacy in information that they provide to third parties. The Court termed that a “rare” case, but it’s clear that other invasive surveillance technologies, particularly those that can track individuals through physical space, are now ripe for challenge. Expect to see much more litigation on this subject from EFF and our friends.

Americans’ Outrage At Unconstitutional Mass Surveillance Made A Difference

In 2013, government contractor Edward Snowden shared evidence confirming, among other things, that the United States government had been conducting mass surveillance on a global scale, including surveillance of its own citizens’ telephone and internet use. Ten years later, there is definitely more work to be done regarding mass surveillance. But some things are undoubtedly better: some of the National Security Agency’s most egregiously illegal programs and authorities have been shuttered or forced to end. The Intelligence Community has started affirmatively releasing at least some important information, although EFF and others have still had to fight some long Freedom of Information Act (FOIA) battles.

Privacy Options Are So Much Better Today

Remember PGP and GPG? If you do, you know that generally, there are much easier ways to send end-to-end encrypted communications today than there used to be. It’s fantastic that people worked so hard to protect their privacy in the past, and it’s fantastic that they don’t have to work as hard now! (If you aren’t familiar with PGP or GPG, just trust us on this one.) 

Advice for protecting online privacy used to require epic how-to guides for complex tools; now, advice is usually just about what relatively simple tools or settings to use. People across the world have Signal and WhatsApp. The web is encrypted, and the Tor Browser lets people visit websites anonymously fairly easily. Password managers protect your passwords and your accounts; third-party cookie blockers like EFF’s Privacy Badger stop third-party tracking. There are even options now to turn off your Ad ID—the key that enables most third-party tracking on mobile devices—right on your phone. These tools and settings all push the needle forward.

We Are Winning The Privacy War, Not Losing It

Sometimes people respond to privacy dangers by comparing them to sci-fi dystopias. But be honest: most science fiction dystopias still scare the heck out of us because they are much, much more invasive of privacy than the world we live in. 

In an essay called “Stop Saying Privacy Is Dead,” Evan Selinger makes a necessary point: “As long as you have some meaningful say over when you are watched and can exert agency over how your data is processed, you will have some modicum of privacy.” 

Of course we want more than a modicum of privacy. But the point here is that many of us generally do get to make decisions about our privacy. Not all of us, of course. But we all recognize that there are different levels of privacy in different places, and that privacy protections aren’t equally good or bad no matter where we go. We have places we can go—online and off—that afford us more protections than others. And because of this, most of the people reading this still have deep private lives, and can choose, with varying amounts of effort, not to allow corporate or government surveillance into those lives.

Privacy is a process, not a single thing. We are always negotiating what levels of privacy we have. We might not always have the upper hand, but we are often able to negotiate. This is why we still see some fictional dystopias and think, “Thank God that’s not my life.” As long as we can do this, we are winning. 

“Giving Up” On Privacy May Not Mean Much to You, But It Does to Many

Shrugging about the dangers of surveillance can seem reasonable when that surveillance isn’t very impactful on our lives. But for many, fighting for privacy isn't a choice, it is a means to survive. Privacy inequity is real; increasingly, money buys additional privacy protections. And if privacy is available for some, then it can exist for all. But we should not accept that some people will have privacy and others will not. This is why digital privacy legislation is digital rights legislation, and why EFF is opposed to data dividends and pay-for-privacy schemes.

Privacy increases for all of us when it increases for each of us. It is much easier for a repressive government to ban end-to-end encrypted messengers when only journalists and activists use them. It is easier to know who is an activist or a journalist when they are the only ones using privacy-protecting services or methods. As the number of people demanding privacy increases, the safer we all are. Sacrificing others because you don't feel the impact of surveillance is a fool's bargain. 

Time Heals Most Privacy Wounds

You may want to tell yourself: companies already know everything about me, so a privacy law a year from now won't help. That's incorrect, because companies are always searching for new data. Some pieces of information will never change, like our biometrics. But chances are you've changed in many ways over the years—whether that's as big as a major life event or as small as a change in your taste in movies—and who you are today is not necessarily who you'll be tomorrow.

As the source of that data, we should have more control over where it goes, and we’re slowly getting it. And because so much of that data goes stale, even if some of our information is already out there, it’s never too late to shut off the faucet. So if we pass a privacy law next year, don't assume it won't do any good just because some information about you has already leaked. It will.

What To Do When You Feel Like It’s Impossible

It can feel overwhelming to care about something that feels like it’s dying a death of a thousand cuts. But worrying about every potential threat, and trying to protect yourself from each of them, all of the time, is a recipe for failure. No one really needs to be vigilant about every threat at all times. That’s why our recommendation is to create a personalized security plan, rather than throwing your hands up or cowering in a corner. 

Once you’ve figured out what threats you should worry about, our advice is to stay involved. We are all occasionally skeptical that we can succeed, but taking action is a great way to get rid of that gnawing feeling that there’s nothing to be done. EFF regularly launches new projects that we hope will help you fight privacy nihilism. We’re in court many times a year fighting privacy violations. We create ways for like-minded, privacy-focused people to work together in their local advocacy groups, through the Electronic Frontier Alliance, our grassroots network of community and campus organizations fighting for digital rights. We even help you teach others to protect their own privacy. And of course every day is a good day for you to join us in telling government officials and companies that privacy matters. 

We know we can win because we’re creating the better future that we want to see every day, and it’s working. But we’re also building the plane while we’re flying it. Just as the death of privacy is not inevitable, neither is our success. It takes real work, and we hope you’ll help us do that work by joining us. Take action. Tell a friend. Download Privacy Badger. Become an EFF member. Gift an EFF membership to someone else.

Don’t give in to privacy nihilism. Instead, share and celebrate the ways we’re winning. 

Privacy Badger Puts You in Control of Widgets

10 January 2024 at 09:34

The latest version of Privacy Badger [1] replaces embedded tweets with click-to-activate placeholders. This is part of Privacy Badger's widget replacement feature, where certain potentially useful widgets are blocked and then replaced with placeholders. This protects privacy by default while letting you restore the original widget whenever you want it or need it for the page to function.

Websites often include external elements such as social media buttons, comments sections, and video players. Although potentially useful, these “widgets” often track your behavior. The tracking happens regardless of whether you click on the widget. If you see a widget, the widget sees you back.

This is where Privacy Badger's widget replacement comes in. When blocking certain social buttons and other potentially useful widgets, Privacy Badger replaces them with click-to-activate placeholders. You will not be tracked by these replacements unless you explicitly choose to activate them.

[Screenshot: Privacy Badger’s widget placeholder. The text inside the placeholder states that “Privacy Badger has replaced this X (Twitter) widget”; the words “this X (Twitter) widget” are a link. There are two buttons inside the placeholder, “Allow once” and “Always allow on this site.”]

Privacy Badger’s placeholders tell you exactly what happened while putting you in control.

Changing the UI of a website is a bold move for a browser extension. That’s what Privacy Badger is all about, though: making strong choices on behalf of user privacy and revealing how that privacy is betrayed by businesses online.

Privacy Badger isn’t the first software to replace embedded widgets with placeholders for privacy or security purposes. As early as 2004, users could install Flashblock, an extension that replaced embedded Adobe Flash plugin content, a notoriously insecure technology.

[Screenshot: Flashblock’s Flash plugin placeholder]

Flashblock’s Flash plugin placeholders lacked user-friendly buttons but got the (Flash blocking) job done.

Other extensions, and eventually even browsers, followed Flashblock in offering similar plugin-blocking placeholders. The need for this declined as plugin use dropped over time, but a new concern rose to prominence: privacy was under attack as social media buttons started spreading everywhere.

This brings us to ShareMeNot. Developed in 2012 as a research tool to investigate how browser extensions might enforce privacy on behalf of the user, ShareMeNot replaced social media “share” buttons with click-to-activate placeholders. In 2014, ShareMeNot became a part of Privacy Badger. While the emphasis has shifted away from social media buttons to interactive widgets like video players and comments sections, Privacy Badger continues to carry on ShareMeNot's legacy.

Unfortunately, widget replacement is not perfect. The placeholder’s buttons may not work sometimes, or the placeholder may appear in the wrong place or may fail to appear at all. We will keep fixing and improving widget replacement. You can help by letting us know when something isn’t working right.

[Screenshot: Privacy Badger’s popup, with Privacy Badger’s browser toolbar icon and the “Report broken site” button highlighted]

To report problems, first click on Privacy Badger’s icon in your browser toolbar. Privacy Badger’s “popup” window will open. Then, click the Report broken site button in the popup.

Pro tip #1: Because our YouTube replacement is not quite ready to be enabled by default, embedded YouTube players are not yet blocked or replaced. If you like though, you can try our YouTube replacement now.

[Screenshot: Privacy Badger’s options page with the Tracking Domains tab selected. The list of tracking domains is filtered for “youtube.com”; the slider for youtube.com is moved to the “Block entirely” position.]

To opt in, visit Privacy Badger's options page, select the “Tracking Domains” tab, search for “youtube.com”, and move the toggle for youtube.com to the Block entirely position.

Pro tip #2: The most private way to activate a replaced widget is to use the “this [YouTube] widget” link (inside the “Privacy Badger has replaced this [YouTube] widget” text), when the link is available. Going through the link, as opposed to one of the Allow buttons, means the widget provider doesn't necessarily get to know what site you activated the widget on. You can also right-click the link to save the widget URL; there is no need to visit the link or to use browser developer tools.

[Screenshot: Privacy Badger’s widget placeholder, with the “this YouTube widget” link highlighted]

Click the link to open the widget in a new tab.

Privacy tools should be measured not only by efficacy, but also ease of use. As we write in the FAQ, we want Privacy Badger to function well without any special knowledge or configuration by the user. Privacy should be made easy, rather than gatekept for “power users.” Everyone should be able to decide for themselves when and with whom they want to share information. Privacy Badger fights to restore this control, biting back at sneaky non-consensual surveillance.

To install Privacy Badger, visit privacybadger.org. Thank you for using Privacy Badger!

 

[1] Privacy Badger version 2023.12.1
