Police officer in spycops scandal deceived two women at same time, inquiry told

10 December 2025 at 08:53

Mark Jenner began a five-year relationship with Alison, a leftwing activist, while undercover, spying on political campaigners

An undercover police officer deceived two women at the same time over many years in a sustained betrayal of both of them, the spycops public inquiry has heard.

Mark Jenner had a relationship with a leftwing activist, known as Alison, for five years without disclosing to her that he was in reality an undercover officer who was spying on political campaigners.


Axon Tests Face Recognition on Body-Worn Cameras

3 December 2025 at 19:00

Axon Enterprise Inc. is working with a Canadian police department to test the addition of face recognition technology (FRT) to its body-worn cameras (BWCs). This is an alarming development in government surveillance that should put communities everywhere on alert. 

As many as 50 officers from the Edmonton Police Department (EPD) will begin using these FRT-enabled BWCs today as part of a proof-of-concept experiment. EPD is the first police department in the world to use these Axon devices, according to a report from the Edmonton Journal.

This kind of technology could give officers instant identification of any person who crosses their path. During the current trial period, the Edmonton officers will not be notified in the field of an individual’s identity but will review identifications generated by the BWCs afterward.

“This Proof of Concept will test the technology’s ability to work with our database to make officers aware of individuals with safety flags and cautions from previous interactions,” as well as “individuals who have outstanding warrants for serious crime,” Edmonton Police described in a press release, suggesting that individuals will be placed on a watchlist of sorts.

FRT brings a rash of problems. It relies on extensive surveillance and the collection of images of individuals, law-abiding or otherwise. Misidentifications can have horrendous consequences for individuals, including prolonged and difficult fights to prove innocence and unjust incarceration for crimes never committed. In a world where police are using real-time face recognition, law-abiding individuals or those participating in legal, protected activity that police may find objectionable — like protest — could be quickly identified.

With the increasing connections being made between disparate data sources about nearly every person, BWCs enabled with FRT can easily connect a person minding their own business, who happens to come within view of a police officer, with a whole slew of other personal information. 

Axon had previously claimed it would pause the addition of face recognition to its tools due to concerns raised in 2019 by the company’s AI and Policing Technology Ethics Board. However, since then, the company has continued to research and consider the addition of FRT to its products. 

This BWC-FRT integration signals that other FRT integrations may follow. Axon is building an entire arsenal of cameras and surveillance devices for law enforcement, and the company grows the reach of its police surveillance apparatus, in part, by leveraging relationships with its thousands of customers, including those using its flagship product, the Taser. This so-called “ecosystem” of surveillance technology includes the Fusus system, a platform for connecting surveillance cameras to facilitate real-time viewing of video footage. It also involves expanding the use of surveillance tools like BWCs and the flying cameras of “drone as first responder” (DFR) programs.

Face recognition undermines individual privacy, and it is too dangerous when deployed by police. Communities everywhere must move to protect themselves and safeguard their civil liberties, insisting on transparency, clear policies, public accountability, and audit mechanisms. Ideally, communities should ban police use of the technology altogether. At a minimum, police must not add FRT to BWCs.

How to Identify Automated License Plate Readers at the U.S.-Mexico Border

2 December 2025 at 11:23

U.S. Customs and Border Protection (CBP), the Drug Enforcement Administration (DEA), and scores of state and local law enforcement agencies have installed a massive dragnet of automated license plate readers (ALPRs) in the US-Mexico borderlands. 

In many cases, the agencies have gone out of their way to disguise the cameras from public view. And the problem is only going to get worse: as recently as July 2025, CBP put out a solicitation to purchase 100 more covert trail cameras with license plate-capture ability. 

Last month, the Associated Press published an in-depth investigation into how agencies have deployed these systems and exploited this data to target drivers. But what do these cameras look like? Here's a guide to identifying ALPR systems when you're driving the open road along the border.

Special thanks to researcher Dugan Meyer and AZ Mirror's Jerod MacDonald-Evoy. All images by EFF and Meyer were taken within the last three years. 

ALPR at Checkpoints and Land Ports of Entry 

All land ports of entry have ALPR systems that capture every vehicle entering and exiting the country. They typically look like this:

License plate readers along the lanes leading into a border crossing

ALPR systems at the Eagle Pass International Bridge Port of Entry. Source: EFF

Most interior checkpoints, which sit anywhere from a few miles to more than 60 miles from the border, are also equipped with ALPR systems operated by CBP. However, the DEA operates a parallel system at most interior checkpoints in southern border states.

When it comes to checkpoints, here's the rule of thumb: If you're traveling away from the border, you are typically being captured by a CBP/Border Patrol system (Border Patrol is a sub-agency of CBP). If you're traveling toward the border, it is most likely a DEA system.

Here's a representative example of a CBP checkpoint camera system:

ALPR cameras next to white trailers along the lane into a checkpoint

ALPR system at the Border Patrol checkpoint near Uvalde, Texas. Source: EFF

At a typical port of entry or checkpoint, each vehicle lane will have an ALPR system. We've even seen Border Patrol checkpoints that were temporarily closed continue to funnel people through these ALPR lanes, even though there was no one on hand to vet drivers face-to-face. According to CBP's Privacy Impact Assessments (2017, 2020), CBP keeps this data for 15 years, but agents can generally only search the most recent five years' worth of data.

The scanners were previously made by a company called Perceptics, which was infamously hacked, leading to a breach of driver data. The systems have since been "modernized" (i.e., replaced) by SAIC.

Here's a close up of the new systems:

Close up of a camera marked "Front."

Frontal ALPR camera at the checkpoint near Uvalde, Texas. Source: EFF

In 2024, the DEA announced plans to integrate port of entry ALPRs into its National License Plate Reader Program (NLPRP), which the agency says is a network of both DEA systems and external law enforcement ALPR systems that it uses to investigate crimes such as drug trafficking and bulk cash smuggling.

Again, if you're traveling towards the border and you pass a checkpoint, you're often captured by parallel DEA systems set up on the opposite side of the road. However, these systems have also been found to be installed on their own away from checkpoints. 

These are a major component of the DEA's NLPRP, which has a standard retention period of 90 days. This program dates back to at least 2010, according to records obtained by the ACLU. 

Here is a typical DEA system that you will find installed near existing Border Patrol checkpoints:

A series of cameras next to a trailer by the side of the road.

DEA ALPR set-up in southern Arizona. Source: EFF

These are typically made by a different vendor, Selex ES, which also includes the brands ELSAG and Leonardo. Here is a close-up:

Close-up of an ALPR cameras

Close-up of a DEA camera near the Tohono O'odham Nation in Arizona. Source: EFF

Covert ALPR

As you drive along border highways, law enforcement agencies have disguised cameras in order to capture your movements. 

The exact number of covert ALPRs at the border is unknown, but to date we have identified approximately 100 sites. We know CBP and DEA each operate covert ALPR systems, but it isn't always possible to know which agency operates any particular set-up. 

Another rule of thumb: if a covert ALPR has a Motorola Solutions camera (formerly Vigilant Solutions) inside, it's likely a CBP system. If it has a Selex ES camera inside, then it is likely a DEA camera. 
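The two rules of thumb above (direction of travel at checkpoints, and camera vendor for covert units) can be sketched as a toy classifier. The heuristics are taken straight from this guide; everything else here — the function name, parameter names, and the "unknown" fallback — is invented for illustration.

```python
# A minimal sketch encoding the guide's rules of thumb for guessing
# which agency operates an ALPR installation. Not a real tool.
def guess_operator(vendor: str = "", direction: str = "") -> str:
    v = vendor.lower()
    if "motorola" in v or "vigilant" in v:
        return "CBP"   # Motorola Solutions (formerly Vigilant) -> likely CBP
    if "selex" in v or "elsag" in v or "leonardo" in v:
        return "DEA"   # Selex ES / ELSAG / Leonardo -> likely DEA
    if direction == "away from border":
        return "CBP"   # checkpoint rule: outbound traffic -> CBP/Border Patrol
    if direction == "toward border":
        return "DEA"   # parallel DEA systems capture inbound traffic
    return "unknown"

print(guess_operator(vendor="Selex ES"))             # DEA
print(guess_operator(direction="away from border"))  # CBP
```

As the guide notes, these are only likelihoods: vendors overlap, and it isn't always possible to know which agency operates a particular setup.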

Here are examples of construction barrels with each kind of camera: 

A camera hidden inside an orange traffic barrel

A covert ALPR with a Motorola Solutions ALPR camera near Calexico, Calif. Source: EFF

These are typically seen along the roadside, often in sets of three, and almost always connected to some sort of solar panel. They are often placed behind existing barriers.

A camera hidden inside an orange traffic barrel

A covert ALPR with a Selex ES camera in southern Arizona. Source: EFF

The DEA models are also found by the roadside, but they also can be found inside or near checkpoints. 

If you're curious (as we were), here's what they look like inside, courtesy of the US Patent and Trademark Office:

Patent drawings showing a traffic barrel and the camera inside it

Patent for portable covert license plate reader. Source: USPTO

In addition to orange construction barrels, agencies also conceal ALPRs in yellow sand barrels. For example, these can be found throughout southern Arizona, especially in the southeastern part of the state.

A camera hidden in a yellow sand barrel.

A covert ALPR system in Arizona. Source: EFF

ALPR Trailers

Sometimes a speed trailer or signage trailer isn't deployed so much for safety as to conceal ALPR systems. Other times, ALPRs are attached to nondescript trailers with no discernible purpose that you'd hardly notice by the side of the road.

It's important to note that it's difficult to know who these belong to, since they often aren't marked. We know that all levels of government, even in the interior of the country, have purchased these setups.

Here are some of the different flavors of ALPR trailers:

A speed trailer capturing ALPR. Speed limit 45 sign.

An ALPR speed trailer in Texas. Source: EFF

A white flat trailer by the side of the road with camera portals on either end.

ALPR trailer in Southern California. Source: EFF

An orange trailer with an ALPR camera and a solar panel.

ALPR trailer in Southern California. Source: EFF

An orange trailer with ALPR cameras by the side of the road.

An ALPR unit in southern Arizona. Source: EFF

A trailer with a pole with mounted ALPR cameras in the desert.

ALPR unit in southern Arizona. Source: EFF

A trailer with a solar panel and an ALPR camera.

A Jenoptik Vector ALPR trailer in La Joya, Texas. Source: EFF

One particularly worrisome version of an ALPR trailer is the Jenoptik Vector: at least two jurisdictions along the border have equipped these trailers not only with ALPR, but with TraffiCatch technology that gathers Bluetooth and Wi-Fi identifiers. This means that in addition to gathering plates, these units also document nearby wireless devices, such as phones, laptops, and even vehicle entertainment systems.

Stationary ALPR 

Stationary or fixed ALPR is one of the more traditional ways of installing these systems. The cameras are placed on existing utility poles or other infrastructure or on poles installed by the ALPR vendor. 

For example, here's a DEA system installed on a highway arch:

The back of a highway overpass sign with ALPR cameras.

The lower set of ALPR cameras belong to the DEA. Source: Dugan Meyer CC BY

A camera and solar panel attached to a streetlight pole.

ALPR camera in Arizona. Source: Dugan Meyer CC BY

Flock Safety

At the local level, thousands of cities around the United States have adopted fixed ALPR, with the company Flock Safety grabbing a huge chunk of the market over the last few years. County sheriffs and municipal police along the border have also embraced the trend, with many using funds earmarked for border security to purchase these systems. Flock allows these agencies to share with one another and contribute their ALPR scans to a national pool of data. As part of a pilot program, Border Patrol had access to this ALPR data for most of 2025. 

A typical Flock Safety setup involves attaching cameras and solar panels to poles. For example:

A red truck passed a pair of Flock Safety ALPR cameras on poles.

Flock Safety ALPR poles installed just outside the Tohono O'odham Nation in Arizona. Source: EFF

A black Flock Safety camera with a small solar panel

A close-up of a Flock Safety camera in Douglas, Arizona. Source: EFF

We've also seen these camera poles placed outside the Santa Teresa Border Patrol station in New Mexico.

Flock may now be the most common provider nationwide, but it isn't the only player in the field. DHS recently released a market survey of 16 different vendors providing similar technology.  

Mobile ALPR 

ALPR cameras can also be found attached to patrol cars. Here's an example of a Motorola Solutions ALPR attached to a Hidalgo County Constable vehicle in South Texas:

An officer stands beside patrol car. Red circle identifies mobile ALPR

Mobile ALPR on a Hidalgo County Constable vehicle. Source: Weslaco Police Department

These allow officers not only to capture ALPR data in real time as they drive, but also to receive an in-car alert when a scan matches a vehicle on a "hot list," the term for a list of plates that law enforcement has flagged for further investigation.

Here's another example: 

A masked police officer stands next to a patrol vehicle with two ALPR cameras.

Mobile ALPR in La Mesa, Calif. Source: La Mesa Police Department Facebook page

Identifying Other Technologies 

EFF has been documenting the wide variety of technologies deployed at the border, including surveillance towers, aerostats, and trail cameras. To learn more, download EFF's zine, "Surveillance Technology at the US-Mexico Border" and explore our map of border surveillance, which includes Google Streetview links so you can see exactly how each installation looks on the ground. Currently we have mapped out most DEA and CBP checkpoint ALPR setups, with covert cameras planned for addition in the near future.

Rights Organizations Demand Halt to Mobile Fortify, ICE's Handheld Face Recognition Program

26 November 2025 at 09:46

Mobile Fortify, the new app that Immigration and Customs Enforcement (ICE) uses to identify people with face recognition technology (FRT) during street encounters, is an affront to the rights and dignity of migrants and U.S. citizens alike. That's why a coalition of privacy, civil liberties, and civil rights organizations is demanding the Department of Homeland Security (DHS) shut down the use of Mobile Fortify, release the agency's privacy analyses of the app, and clarify the agency's policy on face recognition.

As the organizations, including EFF, Asian Americans Advancing Justice, and the Project on Government Oversight, write in a letter sent by EPIC:

ICE’s reckless field practices compound the harm done by its use of facial recognition. ICE does not allow people to opt-out of being scanned, and ICE agents apparently have the discretion to use a facial recognition match as a definitive determination of a person’s immigration status even in the face of contrary evidence.  Using face identification as a definitive determination of immigration status is immensely disturbing, and ICE’s cavalier use of facial recognition will undoubtedly lead to wrongful detentions, deportations, or worse.  Indeed, there is already at least one reported incident of ICE mistakenly determining a U.S. citizen “could be deported based on biometric confirmation of his identity.”

As if this dangerous use of nonconsensual face recognition isn't bad enough, Mobile Fortify also queries a wide variety of government databases. Already there have been reports that federal officers may be using this FRT to target protesters engaging in First Amendment-protected activities. Yet ICE concluded it did not need to conduct a new Privacy Impact Assessment, which is standard practice for proposed government technologies that collect people's data. 

While Mobile Fortify is the latest iteration of ICE’s mobile FRT, EFF has been tracking this type of technology for more than a decade. In 2013, we identified how a San Diego agency had distributed face recognition-equipped phones to law enforcement agencies across the region, including federal immigration officers. In 2019, EFF helped pass a law temporarily banning the collection of biometric data with mobile devices, resulting in the program's cessation.

We fought against handheld FRT then, and we will fight it again today. 

Huawei and Chinese Surveillance

26 November 2025 at 07:05

This quote is from House of Huawei: The Secret History of China’s Most Powerful Company.

“Long before anyone had heard of Ren Zhengfei or Huawei, Wan Runnan had been China’s star entrepreneur in the 1980s, with his company, the Stone Group, touted as “China’s IBM.” Wan had believed that economic change could lead to political change. He had thrown his support behind the pro-democracy protesters in 1989. As a result, he had to flee to France, with an arrest warrant hanging over his head. He was never able to return home. Now, decades later and in failing health in Paris, Wan recalled something that had happened one day in the late 1980s, when he was still living in Beijing.

Local officials had invited him to dinner.

This was unusual. He was usually the one to invite officials to dine, so as to curry favor with the show of hospitality. Over the meal, the officials told Wan that the Ministry of State Security was going to send agents to work undercover at his company in positions dealing with international relations. The officials cast the move to embed these minders as an act of protection for Wan and the company’s other executives, a security measure that would keep them from stumbling into unseen risks in their dealings with foreigners. “You have a lot of international business, which raises security issues for you. There are situations that you don’t understand,” Wan recalled the officials telling him. “They said, ‘We are sending some people over. You can just treat them like regular employees.'”

Wan said he knew that around this time, state intelligence also contacted other tech companies in Beijing with the same request. He couldn’t say what the situation was for Huawei, which was still a little startup far to the south in Shenzhen, not yet on anyone’s radar. But Wan said he didn’t believe that Huawei would have been able to escape similar demands. “That is a certainty,” he said.

“Telecommunications is an industry that has to do with keeping control of a nation’s lifeline…and actually in any system of communications, there’s a back-end platform that could be used for eavesdropping.”

It was a rare moment of an executive lifting the cone of silence surrounding the MSS’s relationship with China’s high-tech industry. It was rare, in fact, in any country. Around the world, such spying operations rank among governments’ closest-held secrets. When Edward Snowden had exposed the NSA’s operations abroad, he’d ended up in exile in Russia. Wan, too, might have risked arrest had he still been living in China.

Here are two book reviews.

How Cops Are Using Flock Safety's ALPR Network to Surveil Protesters and Activists

20 November 2025 at 18:58

It's no secret that 2025 has given Americans plenty to protest about. But as news cameras showed protesters filling streets of cities across the country, law enforcement officers—including U.S. Border Patrol agents—were quietly watching those same streets through different lenses: Flock Safety automated license plate readers (ALPRs) that tracked every passing car. 

Through an analysis of 10 months of nationwide searches on Flock Safety's servers, we discovered that more than 50 federal, state, and local agencies ran hundreds of searches through Flock's national network of surveillance data in connection with protest activity. In some cases, law enforcement specifically targeted known activist groups, demonstrating how mass surveillance technology increasingly threatens our freedom to demonstrate. 

Flock Safety provides ALPR technology to thousands of law enforcement agencies. The company installs cameras throughout its customers' jurisdictions, and these cameras photograph every car that passes, documenting the license plate, color, make, model, and other distinguishing characteristics. This data is paired with time and location and uploaded to a massive searchable database. Flock Safety encourages agencies to share the data they collect broadly with other agencies across the country. It is common for an agency to search thousands of networks nationwide even when it has no reason to believe a targeted vehicle left the region.

Via public records requests, EFF obtained datasets representing more than 12 million searches logged by more than 3,900 agencies between December 2024 and October 2025. The data shows that agencies logged hundreds of searches related to the 50501 protests in February, the Hands Off protests in April, the No Kings protests in June and October, and other protests in between. 

The Tulsa Police Department in Oklahoma was one of the most consistent users of Flock Safety's ALPR system for investigating protests, logging at least 38 such searches. This included running searches that corresponded to a protest against deportation raids in February, a protest at Tulsa City Hall in support of pro-Palestinian activist Mahmoud Khalil in March, and the No Kings protest in June. During the most recent No Kings protests in mid-October, agencies such as the Lisle Police Department in Illinois, the Oro Valley Police Department in Arizona, and the Putnam County (Tenn.) Sheriff's Office all ran protest-related searches. 

While EFF and other civil liberties groups argue the law should require a search warrant for such searches, police are simply prompted to enter text into a "reason" field in the Flock Safety system. Usually this is only a few words–or even just one.

In these cases, that word was often just “protest.” 

Crime does sometimes occur at protests, whether that's property damage, pick-pocketing, or clashes between groups on opposite sides of a protest. Some of these searches may have been tied to an actual crime that occurred, even though in most cases officers did not articulate a criminal offense when running the search. But the truth is, the only reason an officer is able to even search for a suspect at a protest is because ALPRs collected data on every single person who attended the protest. 
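The kind of audit described above — filtering millions of logged searches for protest-related terms in the free-text "reason" field and tallying them per agency — can be sketched in a few lines. The rows, field names, and counts below are invented for illustration; they are not real Flock data.

```python
from collections import Counter

# Hypothetical log rows mimicking the fields discussed above: the querying
# agency, the free-text "reason", and how many camera networks were searched.
searches = [
    {"agency": "Tulsa PD", "reason": "protest", "networks": 425},
    {"agency": "Beaumont PD", "reason": "KINGS DAY PROTEST", "networks": 1774},
    {"agency": "Carson City SO", "reason": "stolen vehicle", "networks": 178},
    {"agency": "Tempe PD", "reason": "ATL No Kings Protest", "networks": 425},
]

PROTEST_TERMS = ("protest", "no king", "rally")  # assumed search terms

def is_protest_related(reason: str) -> bool:
    """Case-insensitive substring match against the protest term list."""
    r = reason.lower()
    return any(term in r for term in PROTEST_TERMS)

hits = [s for s in searches if is_protest_related(s["reason"])]
per_agency = Counter(s["agency"] for s in hits)

print(per_agency)                         # protest-related searches per agency
print(sum(s["networks"] for s in hits))   # total camera networks queried
```

A substring match like this only surfaces searches whose logged reason mentions a protest, which is exactly the limitation noted later in this analysis: searches logged under vaguer pretexts would go undetected.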

Search and Dissent 

2025 was an unprecedented year of street action. In June and again in October, thousands across the country mobilized under the banner of the “No Kings” movement—marches against government overreach, surveillance, and corporate power. By some estimates, the October demonstrations ranked among the largest single-day protests in U.S. history, filling the streets from Washington, D.C., to Portland, OR. 

EFF identified 19 agencies that logged dozens of searches associated with the No Kings protests in June and October 2025. In some cases the term "No Kings" was used explicitly, while in others the logged reason "protest" coincided with the massive demonstrations.

Law Enforcement Agencies that Ran Searches Corresponding with "No Kings" Rallies

  • Anaheim Police Department, Calif.
  • Arizona Department of Public Safety
  • Beaumont Police Department, Texas
  • Charleston Police Department, SC
  • Flagler County Sheriff's Office, Fla.
  • Georgia State Patrol
  • Lisle Police Department, Ill.
  • Little Rock Police Department, Ark.
  • Marion Police Department, Ohio
  • Morristown Police Department, Tenn.
  • Oro Valley Police Department, Ariz.
  • Putnam County Sheriff's Office, Tenn.
  • Richmond Police Department, Va.
  • Riverside County Sheriff's Office, Calif.
  • Salinas Police Department, Calif.
  • San Bernardino County Sheriff's Office, Calif.
  • Spartanburg Police Department, SC
  • Tempe Police Department, Ariz.
  • Tulsa Police Department, Okla.
  • US Border Patrol

For example: 

  • In Washington state, the Spokane County Sheriff's Office listed "no kings" as the reason for three searches on June 15, 2025. The agency queried 95 camera networks, looking for vehicles matching the description of "work van," "bus" or "box truck." 
  • In Texas, the Beaumont Police Department ran six searches related to two vehicles on June 14, 2025, listing "KINGS DAY PROTEST" as the reason. The queries reached across 1,774 networks. 
  • In California, the San Bernardino County Sheriff's Office ran a single search for a vehicle across 711 networks, logging "no king" as the reason. 
  • In Arizona, the Tempe Police Department made three searches for "ATL No Kings Protest" on June 15, 2025, searching through 425 networks. "ATL" is police code for "attempt to locate." The agency appears to not have been looking for a particular plate, but for any red vehicle on the road during a certain time window.

But the No Kings protests weren't the only demonstrations drawing law enforcement's digital dragnet in 2025. 

For example:

  • In Nevada's state capital, the Carson City Sheriff's Office ran three searches that correspond to the February 50501 Protests against DOGE and the Trump administration. The agency searched for two vehicles across 178 networks with "protest" as the reason.
  • In Florida, the Seminole County Sheriff's Office logged "protest" for five searches that correspond to a local May Day rally.
  • In Alabama, the Homewood Police Department logged four searches in early July 2025 for three vehicles with "PROTEST CASE" and "PROTEST INV." in the reason field. The searches, which probed 1,308 networks, correspond to protests against the police shooting of Jabari Peoples.
  • In Texas, the Lubbock Police Department ran two searches for a Tennessee license plate on March 15 that corresponds to a rally to highlight the mental health impact of immigration policies. The searches hit 5,966 networks, with the logged reason "protest veh."
  • In Michigan, Grand Rapids Police Department ran five searches that corresponded with the Stand Up and Fight Back Rally in February. The searches hit roughly 650 networks, with the reason logged as "Protest."

Some agencies have adopted policies that prohibit using ALPRs for monitoring activities protected by the First Amendment. Yet many officers probed the nationwide network with terms like "protest" without articulating an actual crime under investigation.

In a few cases, police were using Flock’s ALPR network to investigate threats made against attendees or incidents where motorists opposed to the protests drove their vehicle into crowds. For example, throughout June 2025, an Arizona Department of Public Safety officer logged three searches for “no kings rock threat,” and a Wichita (Kan.) Police Department officer logged 22 searches for various license plates under the reason “Crime Stoppers Tip of causing harm during protests.”

Even when law enforcement is specifically looking for vehicles engaged in potentially criminal behavior such as threatening protesters, it cannot be ignored that mass surveillance systems work by collecting data on everyone driving to or near a protest, not just those under suspicion.

Border Patrol's Expanding Reach 

As U.S. Border Patrol (USBP), ICE, and other federal agencies tasked with immigration enforcement have massively expanded operations into major cities, advocates for immigrants have responded through organized rallies, rapid-response confrontations, and extended presences at federal facilities. 

USBP has made extensive use of Flock Safety's system for immigration enforcement, but also to target those who object to its tactics. In June, a few days after the No Kings Protest, USBP ran three searches for a vehicle using the descriptor “Portland Riots.” 

USBP also used the Flock Safety network to investigate a motorist who had “extended his middle finger” at Border Patrol vehicles that were transporting detainees. The motorist then allegedly drove in front of one of the vehicles and slowed down, forcing the Border Patrol vehicle to brake hard. An officer ran seven searches for his plate, citing "assault on agent" and "18 usc 111," the federal criminal statute for assaulting, resisting or impeding a federal officer. The individual was charged in federal court in early August. 

USBP had access to the Flock system during a trial period in the first half of 2025, but the company says it has since paused the agency's access to the system. However, Border Patrol and other federal immigration authorities have been able to access the system’s data through local agencies who have run searches on their behalf or even lent them logins.

Targeting Animal Rights Activists

Law enforcement's use of Flock's ALPR network to surveil protesters isn't limited to large-scale political demonstrations. Three agencies also used the system dozens of times to specifically target activists from Direct Action Everywhere (DxE), an animal-rights organization known for using civil disobedience tactics to expose conditions at factory farms.

Delaware State Police queried the Flock national network nine times in March 2025 related to DxE actions, logging reasons such as "DxE Protest Suspect Vehicle." DxE advocates told EFF that these searches correspond to an investigation the organization undertook of a Mountaire Farms facility. 

Additionally, the California Highway Patrol logged dozens of searches related to a "DXE Operation" throughout the day on May 27, 2025. The organization says this corresponds with an annual convening in California that typically ends in a direct action. Participants leave the event early in the morning, then drive across the state to a predetermined but previously undisclosed protest site. Also in May, the Merced County Sheriff's Office in California logged two searches related to "DXE activity." 

As an organization engaged in direct activism, DxE has experienced criminal prosecution for its activities, and so the organization told EFF they were not surprised to learn they are under scrutiny from law enforcement, particularly considering how industrial farmers have collected and distributed their own intelligence to police.

The targeting of DxE activists reveals how ALPR surveillance extends beyond conventional and large-scale political protests to target groups engaged in activism that challenges powerful industries. For animal-rights activists, the knowledge that their vehicles are being tracked through a national surveillance network undeniably creates a chilling effect on their ability to organize and demonstrate.

Fighting Back Against ALPR 

Two Flock Safety cameras on a pole

ALPR systems are designed to capture information on every vehicle that passes within view. That means they don't just capture data on "criminals" but on everyone, all the time, and that includes people engaged in their First Amendment right to publicly dissent. Police are sitting on massive troves of data that can reveal who attended a protest, and this data shows they are not afraid to use it.

Our analysis only includes data where agencies explicitly mentioned protests or related terms in the "reason" field when documenting their search. It's likely that scores more searches were conducted under less obvious pretexts. According to our analysis, approximately 20 percent of all searches we reviewed listed vague language like "investigation," "suspect," and "query" in the reason field. Those terms could well be cover for spying on a protest, an abortion prosecution, or an officer stalking a spouse, and no one would be the wiser, including the agencies whose data was searched. Flock has said it will now require officers to select a specific crime under investigation, but that too can be used to obfuscate dubious searches. 
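
The keyword screen described above can be sketched in a few lines. This is a minimal illustration only: the field name ("reason") and the keyword lists are assumptions for demonstration, not the actual schema of Flock's audit exports or the exact terms used in our analysis.

```python
from collections import Counter

# Illustrative keyword lists; protest terms are checked first because a
# reason like "DxE Protest Suspect Vehicle" contains both kinds of terms.
PROTEST_TERMS = ("protest", "rally", "demonstration", "dxe")
VAGUE_TERMS = ("investigation", "suspect", "query")

def classify_reason(reason: str) -> str:
    """Bucket a search's stated reason as protest-related, vague, or other."""
    r = reason.lower()
    if any(term in r for term in PROTEST_TERMS):
        return "protest-related"
    if any(term in r for term in VAGUE_TERMS):
        return "vague"
    return "other"

# Hypothetical audit rows for demonstration.
rows = [
    {"agency": "Agency A", "reason": "DxE Protest Suspect Vehicle"},
    {"agency": "Agency B", "reason": "investigation"},
    {"agency": "Agency C", "reason": "stolen vehicle"},
]
print(Counter(classify_reason(row["reason"]) for row in rows))
```

Note that this approach can only ever undercount: a search logged under a generic reason is indistinguishable from routine police work, which is exactly the problem described above.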

For protesters, this data should serve as confirmation that ALPR surveillance has been and will be used to target activities protected by the First Amendment. Depending on your threat model, this means you should think carefully about how you arrive at protests, and explore options such as biking, walking, carpooling, taking public transportation, or simply parking a little farther away from the action. Our Surveillance Self-Defense project has more information on steps you can take to protect your privacy when traveling to and attending a protest.

For local officials, this should serve as another example of how systems marketed as protecting your community may actually threaten the values your communities hold most dear. The best way to protect people is to shut down these camera networks.  

Everyone should have the right to speak up against injustice without ending up in a database. 

ARC Data Sale Scandal: Airlines’ Travel Records Used for Warrantless Surveillance

19 November 2025 at 04:18

ARC Data Sale

The ARC Data Sale to U.S. government agencies has come under intense scrutiny following reports of warrantless access to Americans’ travel records. After growing pressure from lawmakers, the Airlines Reporting Corporation (ARC), a data broker collectively owned by major U.S. airlines, has announced it will shut down its Travel Intelligence Program (TIP), a system that allowed federal agencies to search through hundreds of millions of passenger travel records without judicial oversight.

Lawmakers Question ARC Data Sale and Warrantless Access

Concerns over the ARC Data Sale intensified this week after a bipartisan group of lawmakers sent letters to nine airline CEOs urging them to stop the practice immediately. The letter cited reports that government agencies, including the Department of Homeland Security (DHS), the Internal Revenue Service (IRS), the Securities and Exchange Commission (SEC), and the FBI, had been accessing ARC’s travel database without obtaining warrants or court orders. According to the lawmakers, ARC sold access to a system containing approximately 722 million ticket transactions covering 39 months of past and future travel data. This includes bookings made through more than 10,000 U.S.-based travel agencies, popular online travel portals like Expedia, Kayak, and Priceline, and even credit-card reward program bookings.

Travel details in this database include a passenger’s name, itinerary, flight numbers, fare details, ticket numbers, and sometimes the credit card digits used during the purchase. Documents released through public records requests show that the FBI received travel records from ARC based solely on written requests, bypassing the need for subpoenas. DHS described the database as “an unparalleled intelligence resource.”

IRS Admits Policy Violations in Handling Travel Data

A central point of concern is the revelation that the IRS accessed ARC’s travel database without conducting a legal review or completing a required Privacy Impact Assessment. Under the E-Government Act of 2002, federal agencies must complete such assessments before procuring systems that collect personal data. In a disclosure to Senator Ron Wyden, the IRS admitted it had purchased ARC’s airline data without meeting these requirements. The agency only completed the privacy assessment after receiving an oversight inquiry in 2025. It also confirmed that it had not initially reviewed whether accessing the travel data constituted a search that required a warrant, despite previous commitments to do so after a 2021 investigation into cell-phone location data purchases.

Prospective Surveillance Raises New Privacy Concerns

Beyond historical travel data, lawmakers highlighted that ARC’s tools enabled what they termed “prospective surveillance.” Through automated, recurring searches, government agencies could receive alerts the moment a ticket matching specific criteria was booked. This type of forward-looking monitoring typically requires a higher legal threshold and is allowed only in limited circumstances authorized by Congress. Lawmakers argued that buying such capabilities from a data broker like ARC allowed agencies to circumvent the Fourth Amendment, undermining Americans’ constitutional protection against unreasonable searches. Because ARC only captures bookings made through travel agencies, individuals booking directly with airlines do not have their travel data in the system, effectively creating inconsistent privacy protections based solely on how a ticket is purchased.

ARC Confirms End of Travel Intelligence Program

In a letter sent on Tuesday, ARC CEO Lauri Reishus informed lawmakers that the company would end the Travel Intelligence Program in the coming weeks. The decision follows public and political pressure since September, when media reports first revealed the extent of ARC’s data-sharing arrangements with government agencies. Lawmakers noted that airlines benefit financially when passengers book tickets directly, raising concerns that the surveillance program not only threatened privacy rights but also created potential antitrust implications. As lawmakers push for stronger privacy protections and clearer limits on government surveillance, the ARC data sale case has become a high-profile example of how easily personal travel data can be accessed and shared without passengers’ knowledge.

Washington Court Rules That Data Captured on Flock Safety Cameras Are Public Records

12 November 2025 at 18:00

A Washington state trial court has shot down local municipalities’ effort to keep automated license plate reader (ALPR) data secret.

The Skagit County Superior Court in Washington rejected the attempt to block the public’s right to access data gathered by Flock Safety cameras, protecting access to information under the Washington Public Records Act (PRA). Importantly, the ruling from the court makes it clear that this access is protected even when a Washington city uses Flock Safety, a third-party vendor, to conduct surveillance and store personal data on behalf of a government agency.

"The Flock images generated by the Flock cameras...are public records," the court wrote in its ruling. "Flock camera images are created and used to further a governmental purpose. The Flock images created by the cameras located in Stanwood and Sedro-Woolley were paid for by Stanwood and Sedro Wooley [sic] and were generated for the benefit of Stanwood and Sedro-Woolley."

The cities’ move to exempt the records from disclosure was a dangerous attempt to deny transparency and reflects another problem with the massive amount of data that police departments collect through Flock cameras and store on Flock servers: the wiggle room cities seek when public data is hosted on a private company’s server.

Flock Safety's main product is ALPRs, camera systems installed throughout communities to track all drivers all the time. Privacy activists and journalists across the country recently have used public records requests to obtain data from the system, revealing a variety of controversial uses. This has included agencies accessing data for immigration enforcement and to investigate an abortion, the latter of which may have violated Washington law. A recent report from the University of Washington found that some cities in the state are also sharing the ALPR data from their Flock Safety systems with federal immigration agents. 

In this case, a member of the public in April filed a records request with a Flock customer, the City of Stanwood, for all footage recorded during a one-hour period in March. Shortly afterward, Stanwood and another Flock user, the City of Sedro-Woolley, asked the local court to rule that this data is not a public record, asserting that “data generated by Flock [automated license plate reader cameras (ALPRs)] and stored in the Flock cloud system are not public records unless and until a public agency extracts and downloads that data." 

If a government agency is conducting mass surveillance, EFF supports individuals’ access to data collected specifically on them, at the very least. And to address legitimate privacy concerns, governments can and should redact personal information in these records while still disclosing information about how the systems work and the data that they capture. 

This isn’t what these Washington cities offered, though. They tried a few different arguments against releasing any information at all. 

The contract between the City of Sedro-Woolley and Flock Safety clearly states that "As between Flock and Customer, all right, title and interest in the Customer Data, belong to and are retained solely by Customer,” and “Customer Data” is defined as "the data, media, and content provided by Customer through the Services. For the avoidance of doubt, the Customer Data will include the Footage." Other Flock-using police departments across the country have also relied on similar contract language to insist that footage captured by Flock cameras belongs to the jurisdiction in question. 

The contract language notwithstanding, officials in Washington attempted to restrict public access by claiming that fulfilling requests for video footage stored on Flock’s servers would constitute the generation of a new record. This part of the argument claimed that any information that was gathered but not otherwise accessed by law enforcement, including thousands of images taken every day by the agency’s 14 Flock ALPR cameras, had nothing to do with government business, would require generating a new record, and should not be subject to records requests. The cities shut off their Flock cameras while the litigation was ongoing.

If the court had ruled in favor of the cities’ claim, police could move to store all their data — from their surveillance equipment and otherwise — on private company servers and claim that it's no longer accessible to the public. 

The cities threw another reason for withholding information at the wall to see if it would stick, claiming that even if the court found that data collected by Flock cameras are in fact public records, the cities should still be able to block the release of the requested hour of footage, either because all of the images captured by Flock cameras are sensitive investigation material or because they should be treated the same way as footage from automated traffic safety cameras.

EFF is particularly opposed to this line of reasoning. In 2017, the California Supreme Court sided with EFF and ACLU in a case arguing that “the license plate data of millions of law-abiding drivers, collected indiscriminately by police across the state, are not ‘investigative records’ that law enforcement can keep secret.” 

Notably, when Stanwood Police Chief Jason Toner made his pitch to the City Council to procure the Flock cameras in April 2024, he was adamant that the ALPRs would not be the same as traffic cameras. “Flock Safety Cameras are not ‘red light’ traffic cameras nor are they facial recognition cameras,” Chief Toner wrote at the time, adding that the system would be a “force multiplier” for the department. 

If the court had gone along with this part of the argument, cities could have claimed that the mass surveillance conducted using ALPRs is part of undefined mass investigations, withholding from the public huge amounts of information gathered without warrants or reason.

The cities seemed to be setting up contradictory arguments. Maybe the footage captured by the cities’ Flock cameras belongs to the city — or maybe it doesn’t until the city accesses it. Maybe the data collected by the cities’ taxpayer-funded cameras are unrelated to government business and should be inaccessible to the public — or maybe it’s all related to government business and, specifically, to sensitive investigations, presumably of every single vehicle that goes by the cameras. 

The requester, Jose Rodriguez, still won’t be getting his records, despite the court’s positive ruling. 

“The cities both allowed the records to be automatically deleted after I submitted my records requests and while they decided to have their legal council review my request. So they no longer have the records and can not provide them to me even though they were declared to be public records,” Rodriguez told 404 Media — another possible violation of that state’s public records laws. 

Flock Safety and its ALPR system have come under increased scrutiny in the last few months, as the public has become aware of illegal and widespread sharing of information. 

The system was used by the Johnson County Sheriff’s Office to track someone across the country who’d self-administered an abortion in Texas. Flock repeatedly claimed that this was inaccurate reporting, but materials recently obtained by EFF have affirmed that Johnson County was investigating that individual as part of a fetal death investigation, conducted at the request of her former abusive partner. They were not looking for her as part of a missing person search, as Flock said. 

In Illinois, the Secretary of State conducted an audit of Flock use within the state and found that the Flock Safety system was facilitating Customs and Border Protection access, in violation of state law. And in California, the Attorney General recently sued the City of El Cajon for using Flock to illegally share information across state lines.


Police departments are increasingly relying on third-party vendors for surveillance equipment and storage for the terabytes of information they’re gathering. Refusing the public access to this information undermines public records laws and the assurances the public has received when police departments set these powerful spying tools loose in their streets. While it’s great that these records remain public in Washington, communities around the country must be swift to reject similar attempts at blocking public access.

Meet NEO 1X: The Robot That Does Chores and Spies on You?

10 November 2025 at 00:00

The future of home robotics is here — and it’s a little awkward. Meet the NEO 1X humanoid robot, designed to help with chores but raising huge cybersecurity and privacy questions. We discuss what it can actually do, the risks of having an always-connected humanoid in your home, and why it’s definitely not the “Robot […]

The post Meet NEO 1X: The Robot That Does Chores and Spies on You? appeared first on Shared Security Podcast.

The post Meet NEO 1X: The Robot That Does Chores and Spies on You? appeared first on Security Boulevard.

First Wap: A Surveillance Computer You’ve Never Heard Of

27 October 2025 at 07:08

Mother Jones has a long article on surveillance arms manufacturers, their wares, and how they avoid export control laws:

Operating from their base in Jakarta, where permissive export laws have allowed their surveillance business to flourish, First Wap’s European founders and executives have quietly built a phone-tracking empire, with a footprint extending from the Vatican to the Middle East to Silicon Valley.

It calls its proprietary system Altamides, which it describes in promotional materials as “a unified platform to covertly locate the whereabouts of single or multiple suspects in real-time, to detect movement patterns, and to detect whether suspects are in close vicinity with each other.”

Altamides leaves no trace on the phones it targets, unlike spyware such as Pegasus. Nor does it require a target to click on a malicious link or show any of the telltale signs (such as overheating or a short battery life) of remote monitoring.

Its secret is shrewd use of the antiquated telecom language Signaling System No. 7, known as SS7, that phone carriers use to route calls and text messages. Any entity with SS7 access can send queries requesting information about which cell tower a phone subscriber is nearest to, an essential first step to sending a text message or making a call to that subscriber. But First Wap’s technology uses SS7 to zero in on phone numbers and trace the location of their users.

Much more in this Lighthouse Reports analysis.

No Tricks, Just Treats 🎃 EFF’s Halloween Signal Stickers Are Here!

20 October 2025 at 16:37

EFF usually warns of new horrors threatening your rights online, but this Halloween we’ve summoned a few of our own we’d like to share.  Our new Signal Sticker Pack highlights some creatures—both mythical and terrifying—conjured up by our designers for you to share this spooky season.

If you’re new to Signal, it's a free and secure messaging app built by the nonprofit Signal Foundation at the forefront of defending user privacy. While chatting privately, you can add some seasonal flair with Signal Stickers, and rest assured: friends receiving them get the full sticker pack fully encrypted, safe from prying eyes and lurking spirits.

How To Get and Share Signal Stickers

On any mobile device or desktop with the Signal app installed, you can simply click the button below.

Download EFF's Signal Stickers

To share Frights and Rights  

You can also paste the sticker link directly into a Signal chat, and then tap it to download the pack directly to the app.

Once the stickers are installed, they’re even easier to share: simply open a chat, tap the sticker menu on your keyboard, and send one of EFF’s spooky stickers. The recipient will then be asked if they’d like to get the sticker pack too.

All of this works without any third parties knowing what sticker packs you have or whom you shared them with. Our little ghosts and ghouls are just between us.

Meet The Encryptids

 a banshee, bigfoot, and a ghost

These familiar champions of digital rights—The Encryptids—are back! Don’t let their monstrous looks fool you; each one advocates for privacy, security, and a dash of weirdness in their own way. Whether they’re shouting about online anonymity or the importance of interoperability, they’re ready to help you share your love for digital rights. Learn more about their stories here, and you can even grab a bigfoot pin to let everyone know that privacy is a “human” right.

Street-Level Surveillance Monsters

 a body worn camera, face recognition spider, and flying wraith

On a cool autumn night, you might be on the lookout for ghosts and ghouls from your favorite horror flicks—but in the real world, there are far scarier monsters lurking in the dark: police surveillance technologies. Often hidden in plain sight, these tools quietly watch from the shadows and are hard to spot. That’s why we’ve given these tools the hideous faces they deserve in our Street-Level Surveillance Monsters series, ready to scare (and inform) your loved ones.

Copyright Creatures

 including a copyright thief, copyright robots, and a troll

Ask any online creator and they’ll tell you: few things are scarier than a copyright takedown. From unfair DMCA claims and demonetization to frivolous lawsuits designed to intimidate people into a hefty payment, the creeping expansion of copyright can inspire as much dread as any monster on the big screen. That’s why this pack includes a few trolls and creeps straight from a broken copyright system—where profit haunts innovation. 

To that end, all of EFF’s work (including these stickers) is released under an open CC-BY license, free for you to use and remix as you see fit.

Happy Haunting Everybody!

These frights may disappear with your message, but the fights persist. That’s why we’re so grateful to EFF supporters for helping us make the digital world a little more weird and a little less scary. You can become a member today and grab some gear to show your support. Happy Halloween!

DONATE TODAY

No One Should Be Forced to Conform to the Views of the State

16 October 2025 at 15:05

Should you have to think twice before posting a protest flyer to your Instagram story? Or feel pressure to delete that bald JD Vance meme that you shared? Now imagine that you could get kicked out of the country—potentially losing your job or education—based on the Trump administration’s dislike of your views on social media. 

That threat to free expression and dissent is happening now, but we won’t let it stand. 

"...they're not just targeting individuals—they're targeting the very idea of freedom itself."

The Electronic Frontier Foundation and co-counsel are representing the United Automobile Workers (UAW), Communications Workers of America (CWA), and American Federation of Teachers (AFT) in a lawsuit against the U.S. State Department and Department of Homeland Security for their viewpoint-based surveillance and suppression of noncitizens’ First Amendment-protected speech online.  The lawsuit asks a federal court to stop the government’s unconstitutional surveillance program, which has silenced citizens and noncitizens alike. It has even hindered unions’ ability to associate with their members. 

"When they spy on, silence, and fire union members for speaking out, they're not just targeting individuals—they're targeting the very idea of freedom itself,” said UAW President Shawn Fain. 

The Trump administration has built this mass surveillance program to monitor the constitutionally protected online speech of noncitizens who are lawfully present in the U.S. The program uses AI and automated technologies to scour social media and other online platforms to identify and punish individuals who express viewpoints the government considers "hostile" to "our culture" and "our civilization".  But make no mistake: no one should be forced to conform to the views of the state. 

The Foundation of Democracy 

Your free expression and privacy are fundamental human rights, and democracy crumbles without them. We have an opportunity to fight back, but we need you.  EFF’s team of lawyers, activists, researchers, and technologists have been on a mission to protect your freedom online since 1990, and we’re just getting started.

Donate and become a member of EFF today. Your support helps protect crucial rights, online and off, for everyone.

Give Today

Victory! California Requires Transparency for AI Police Reports

14 October 2025 at 13:44

California Governor Newsom has signed S.B. 524, a bill that begins the long process of regulating and imposing transparency on the growing problem of AI-written police reports. EFF supported this bill and has spent the last year vocally criticizing the companies pushing AI-generated police reports as a service. 

S.B. 524 requires police to disclose on the report whether AI was used to author it, in whole or in part. It also bans vendors from selling or sharing the information a police agency provides to the AI. 

The bill is also significant because it requires departments to retain the first draft of the report so that judges, defense attorneys, or auditors can readily see which portions of the final report were written by the officer and which were written by the computer. This creates major problems for police who use the most popular product in this space: Axon’s Draft One. By design, Draft One does not retain an edit log of who wrote what. Now, to stay in compliance with the law, police departments will either need Axon to change their product, or officers will have to retain evidence of what the draft of their report looked like. Or, police can drop Axon’s Draft One altogether. 
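
To make the retention requirement concrete, here is a minimal Python sketch of how a retained first draft could be diffed against the final report. The report text is invented, and this is not Axon's or any other vendor's actual format; it only illustrates why keeping the draft matters for auditors.

```python
import difflib

# Invented example text: a retained AI first draft and the officer's
# final version of the same report.
ai_draft = [
    "Officer responded to a report of a disturbance.",
    "The subject was detained without incident.",
]
final_report = [
    "Officer responded to a report of a disturbance.",
    "The subject fled on foot and was detained after a pursuit.",
]

# A unified diff marks unchanged lines with a space, removed AI-draft
# lines with "-", and officer-added lines with "+".
diff = list(difflib.unified_diff(
    ai_draft, final_report,
    fromfile="ai_first_draft", tofile="final_report", lineterm=""))
print("\n".join(diff))
```

Without the retained draft on the left-hand side, no such comparison is possible, which is precisely the gap the retention requirement closes.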

EFF will continue to monitor whether departments are complying with this state law.

After Utah, California has become the second state to pass legislation that begins to address this problem. Because of the lack of transparency surrounding how police departments buy and deploy technology, it’s often hard to know if police departments are using AI to write reports, how the generative AI chooses to translate audio to a narrative, and which portions of reports are written by AI and which parts are written by the officers. EFF has written a guide to help you file public records requests that might shed light on your police department’s use of AI to write police reports. 

It’s still unclear if products like Draft One run afoul of record retention laws, and how AI-written police reports will impact the criminal justice system. We will need to consider more comprehensive regulation and perhaps even prohibition of this use of generative AI. But S.B. 524 is a good first step. We hope that more states will follow California and Utah’s lead and pass even stronger bills.

The Trump Administration’s Increased Use of Social Media Surveillance

14 October 2025 at 07:09

This chilling paragraph is in a comprehensive Brookings report about the use of tech to deport people from the US:

The administration has also adapted its methods of social media surveillance. Though agencies like the State Department have gathered millions of handles and monitored political discussions online, the Trump administration has been more explicit in who it’s targeting. Secretary of State Marco Rubio announced a new, zero-tolerance “Catch and Revoke” strategy, which uses AI to monitor the public speech of foreign nationals and revoke visas of those who “abuse [the country’s] hospitality.” In a March press conference, Rubio remarked that at least 300 visas, primarily student and visitor visas, had been revoked on the grounds that visitors are engaging in activity contrary to national interest. A State Department cable also announced a new requirement for student visa applicants to set their social media accounts to public—reflecting stricter vetting practices aimed at identifying individuals who “bear hostile attitudes toward our citizens, culture, government, institutions, or founding principles,” among other criteria.

Flock License Plate Surveillance

8 October 2025 at 12:10

The company Flock is surveilling us as we drive:

A retired veteran named Lee Schmidt wanted to know how often Norfolk, Virginia’s 176 Flock Safety automated license-plate-reader cameras were tracking him. The answer, according to a U.S. District Court lawsuit filed in September, was more than four times a day, or 526 times from mid-February to early July. No, there’s no warrant out for Schmidt’s arrest, nor is there a warrant for Schmidt’s co-plaintiff, Crystal Arrington, whom the system tagged 849 times in roughly the same period.

You might think this sounds like it violates the Fourth Amendment, which protects American citizens from unreasonable searches and seizures without probable cause. Well, so does the American Civil Liberties Union. Norfolk, Virginia Judge Jamilah LeCruise also agrees, and in 2024 she ruled that plate-reader data obtained without a search warrant couldn’t be used against a defendant in a robbery case.

Hey, San Francisco, There Should be Consequences When Police Spy Illegally

3 October 2025 at 14:07

A San Francisco supervisor has proposed that police and other city agencies should have no financial consequences for breaking a landmark surveillance oversight law. In 2019, organizations from across the city worked together to help pass that law, which required law enforcement to get the approval of democratically elected officials before they bought and used new spying technologies. Bit by bit, the San Francisco Police Department and the Board of Supervisors have weakened that law, but one important feature of the law remained: if city officials are caught breaking this law, residents can sue to enforce it, and if they prevail they are entitled to attorney fees. 

Now Supervisor Matt Dorsey believes that this important accountability feature is “incentivizing baseless but costly lawsuits that have already squandered hundreds of thousands of taxpayer dollars over bogus alleged violations of a law that has been an onerous mess since it was first enacted.” 

Between 2010 and 2023, San Francisco had to spend roughly $70 million to settle civil suits brought against the SFPD for alleged misconduct ranging from shooting city residents to wrongfully firing whistleblowers. This is not “squandered” money; it is compensating people for injury. We are all governed by laws and are all expected to act accordingly; police are not exempt from consequences for using their power wrongfully. In the 21st century, this accountability must extend to using powerful surveillance technology responsibly. 

The ability to sue a police department when they violate the law is called a “private right of action” and it is absolutely essential to enforcing the law. Government officials tasked with making other government officials turn square corners will rarely have sufficient resources to do the job alone, and often they will not want to blow the whistle on peers. But city residents empowered to bring a private right of action typically cannot do the job alone, either; they need a lawyer to represent them. So private rights of action provide for an attorney fee award to people who win these cases. This is a routine part of scores of public interest laws involving civil rights, labor safeguards, environmental protection, and more.

Without an enforcement mechanism to hold police accountable, many will just ignore the law. They’ve done it before. AB 481 is a California state law that requires police to get elected official approval before attempting to acquire military equipment, including drones. The SFPD knowingly ignored this law. If it had an enforcement mechanism, more police would follow the rules. 

President Trump recently included San Francisco in a list of cities he would like the military to occupy. Law enforcement agencies across the country, either willingly or by compulsion, have been collaborating with federal agencies operating at the behest of the White House. So it would be best for cities to keep their co-optable surveillance infrastructure small, transparent, and accountable. With authoritarianism looming, now is not the time to make police harder to hold accountable, especially considering SFPD has already disclosed surveillance data to Immigration and Customs Enforcement (ICE) in violation of California state law.  

We’re calling on the Board of Supervisors to reject Supervisor Dorsey’s proposal. If police want to avoid being sued and forced to pay the prevailing party’s attorney fees, they should avoid breaking the laws that govern police surveillance in the city.

Flock’s Gunshot Detection Microphones Will Start Listening for Human Voices

2 October 2025 at 11:45

Flock Safety, the police technology company best known for its extensive network of automated license plate readers spread throughout the United States, is rolling out a new and troubling product that may create headaches for the cities that adopt it: detection of “human distress” via audio. As part of its suite of technologies, Flock has been pushing Raven, its version of acoustic gunshot detection. These devices capture sounds in public places and use machine learning to try to identify gunshots and then alert police, but EFF has long warned that they are also high-powered microphones parked above densely populated city streets. Cities now have one more reason to follow the lead of many other municipalities and cancel their Flock contracts before this new feature causes civil liberties harms to residents and headaches for city governments. 

In marketing materials, Flock has been touting new features of its Raven product—including the device’s ability to alert police to sounds of “distress.” The online ad for the product, which allows cities to apply for early access to the technology, shows an image of police getting an alert for “screaming.”

It’s unclear how this technology works. In acoustic gunshot detection, the microphones generally listen for sounds that would signify gunshots (though in practice they often mistake car backfires or fireworks for gunshots). Flock needs to come forward now with an explanation of exactly how its new technology functions. It is also unclear how these devices will interact with state “eavesdropping” laws that limit listening to or recording the private conversations that often take place in public.
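Flock has not said how Raven’s new feature works, but acoustic event detection is generally an audio-classification problem: extract features from a sound clip, then map them to event labels. This minimal Python sketch is purely illustrative (the feature names, thresholds, and labels are hypothetical, not Flock’s system), but it shows why a “distress” capability necessarily means analyzing human voices, and why impulsive sounds like fireworks can produce false gunshot alerts:

```python
# Hypothetical sketch of an acoustic event classifier. All feature names,
# thresholds, and labels are illustrative; Flock has not disclosed how
# Raven actually works.

def classify_clip(features: dict) -> list:
    """Map extracted audio features to candidate alert labels."""
    labels = []
    # Loud, impulsive bursts read as possible gunshots, which is
    # exactly why backfires and fireworks cause false alerts.
    if features.get("impulsive") and features.get("peak_db", 0) > 120:
        labels.append("possible_gunshot")
    # A "distress" label requires detecting human vocal sounds, meaning
    # the microphones must analyze voices, not just bangs.
    if features.get("vocal") and features.get("sustained_high_pitch"):
        labels.append("possible_screaming")
    return labels

firework = classify_clip({"impulsive": True, "peak_db": 135})
scream = classify_clip({"vocal": True, "sustained_high_pitch": True})
```

A real system would compute these features with a trained model rather than hand-written rules, but the failure mode is the same: any sound matching the acoustic profile triggers an alert, regardless of what actually happened.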

Flock is no stranger to creating legal headaches for the cities and states that adopt its products. In Illinois, Flock was accused of violating state law by allowing Immigration and Customs Enforcement (ICE), a federal agency, to access license plate reader data collected within the state. That’s not all. In 2023, a North Carolina judge halted the installation of Flock cameras statewide for operating in the state without a license. When the city of Evanston, Illinois recently canceled its contract with Flock, it ordered the company to take down its license plate readers–only for Flock to mysteriously reinstall them a few days later. The city has since sent Flock a cease and desist order and, in the meantime, has put black tape over the cameras. For some, the technology isn’t worth its mounting downsides. As one Illinois village trustee wrote while explaining his vote to cancel his village’s contract with Flock, “According to our own Civilian Police Oversight Commission, over 99% of Flock alerts do not result in any police action.”

Gunshot detection technology is dangerous enough as it is—police showing up to alerts they think are gunfire only to find children playing with fireworks is a recipe for innocent people getting hurt. This isn’t hypothetical: in Chicago, a child really was shot at by police who thought they were responding to gunfire because of a ShotSpotter alert. Introducing a feature that lets these pre-installed Raven microphones all over cities begin listening for human voices in distress is likely to open a whole new can of unforeseen legal, civil liberties, and even bodily safety consequences.

EFF Urges Virginia Court of Appeals to Require Search Warrants to Access ALPR Databases

29 September 2025 at 12:51

This post was co-authored by EFF legal intern Olivia Miller.

For most Americans, driving is part of everyday life. Practically speaking, many of us drive to work, school, play, and everywhere in between. Not only do we visit places that give insights into our personal lives, but we sometimes use our vehicles to display information about our political beliefs, socioeconomic status, and other intimate details.

All of this personal activity can be tracked and identified through Automatic License Plate Reader (ALPR) data—a popular surveillance tool used by law enforcement agencies across the country. That’s why, in an amicus brief filed with the Virginia Court of Appeals, EFF, the ACLU of Virginia, and NACDL urged the court to require police to seek a warrant before searching ALPR data.

In Commonwealth v. Church, a police officer in Norfolk, Virginia searched license plate data without a warrant—not to prove that defendant Ronnie Church was at the scene of the crime, but merely to try to show he had a “guilty mind.” The lower court, in a one-page ruling relying on Commonwealth v. Bell, held this warrantless search violated the Fourth Amendment and suppressed the ALPR evidence. We argued the appellate court should uphold this decision.

Like the cellphone location data the Supreme Court protected in Carpenter v. United States, ALPR data threatens people’s privacy because it is collected indiscriminately over time and can provide police with a detailed picture of a person’s movements. ALPR data includes photos of license plates, vehicle make and model, any distinctive features of the vehicle, and precise time and location information. Once an ALPR logs a car’s data, the information is uploaded to the cloud and made accessible to law enforcement agencies at the local, state, and federal levels—creating a near real-time tracking tool that can follow individuals across vast distances.

Think police only use ALPRs to track suspected criminals? Think again. ALPRs are ubiquitous; every car traveling into a camera’s view generates a detailed dataset, regardless of any suspected criminal activity. In fact, a survey of 173 law enforcement agencies employing ALPRs nationwide revealed that 99.5% of scans belonged to people who had no association with crime.

Norfolk County, Virginia, is home to over 170 ALPR cameras operated by Flock, a surveillance company that maintains over 83,000 ALPRs nationwide. The resulting surveillance network is so large that Norfolk County’s police chief suggested “it would be difficult to drive any distance and not be recorded by one.”

Recent and near-horizon advancements in Flock’s products will continue to threaten our privacy and further the surveillance state. For example, Flock’s ALPR data has been used for immigration raids, to track individuals seeking abortion-related care, to conduct fishing expeditions, and to identify relationships between people who may be traveling together but in different cars. With the help of artificial intelligence, ALPR databases could be aggregated with other information from data breaches and data brokers, to create “people lookup tools.” Even public safety advocates and law enforcement, like the International Association of Chiefs of Police, have warned that ALPR tech creates a risk “that individuals will become more cautious in their exercise of their protected rights of expression, protest, association, political participation because they consider themselves under constant surveillance.”  

This is why a warrant requirement for ALPR data is so important. As the Virginia trial court previously found in Bell, prolonged surveillance tracking of public movements invades people’s reasonable expectation of privacy in the entirety of their movements. Recent Fourth Amendment jurisprudence, including Carpenter and Leaders of a Beautiful Struggle from the federal Fourth Circuit Court of Appeals, favors a warrant requirement as well. Like the technologies at issue in those cases, ALPRs give police the ability to chronicle movements in a “detailed, encyclopedic” record, akin to “attaching an ankle monitor to every person in the city.”

The Virginia Court of Appeals has a chance to draw a clear line on warrantless ALPR surveillance, and to tell Norfolk PD what the Fourth Amendment already says: come back with a warrant.

That Drone in the Sky Could Be Tracking Your Car

22 September 2025 at 08:00

Police are using their drones as flying automated license plate readers (ALPRs), airborne police cameras that make it easier than ever for law enforcement to follow you. 

"The Flock Safety drone, specifically, are flying LPR cameras as well,” Rahul Sidhu, Vice President of Aviation at Flock Safety, recently told a group of potential law enforcement customers interested in drone-as-first-responder (DFR) programs

The integration of Flock Safety’s flagship ALPR technology with its Aerodome drone equipment is a police surveillance combo poised to elevate the privacy threats to civilians caused by both of these invasive technologies as drone adoption expands. 

A slide from a Flock Safety presentation to Rutherford County Sheriff's Office in North Carolina, obtained via public records, featuring Flock Safety products, including the Aerodome drone and the Wing product, which helps convert surveillance cameras into ALPR systems

The use of DFR programs has grown exponentially. The biggest police technology companies, like Axon, Flock Safety, and Motorola Solutions, are broadening their drone offerings, anticipating that drones could become an important piece of their revenue stream. 

Communities must demand restrictions on how local police use drones and ALPRs, to say nothing of a dangerous hybrid of the two. Otherwise, we can soon expect a drone to fly to every call for service and capture sensitive location information about each car in its flight path, adding still more ALPR data to the already-too-large databases of our movements.

ALPR systems typically rely on cameras that have been fixed along roadways or attached to police vehicles. These cameras capture the image of a vehicle, then use artificial intelligence technology to log the license plate, make, model, color, and other unique identifying information, like dents and bumper stickers. This information is usually stored on the manufacturer’s servers and often made available on nationwide sharing networks to police departments from other states and to federal agencies, including Immigration and Customs Enforcement. ALPRs are already used by most of the largest police departments in the country, and Flock Safety now also offers the ability for an agency to turn almost any internet-enabled camera into an ALPR camera.
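The data flow described above can be pictured as a simple per-vehicle record pipeline. This hypothetical Python sketch (the field names are illustrative, not any vendor’s actual schema) shows the kind of structured record a single camera pass generates and how it ends up in a shared database:

```python
# Hypothetical sketch of the per-vehicle record an ALPR system generates,
# as described above. Field names are illustrative, not a vendor schema.
from dataclasses import dataclass

@dataclass
class PlateRead:
    plate: str          # plate text extracted by the recognition model
    make: str
    model: str
    color: str
    features: list      # dents, bumper stickers, other unique identifiers
    lat: float
    lon: float
    timestamp: str      # precise capture time

def upload(read, shared_db):
    """Each read goes to the vendor's cloud, where it can be searched
    by agencies far beyond the one that owns the camera."""
    shared_db.append(read)

shared_db = []  # stands in for a vendor's nationwide database
upload(PlateRead("7ABC123", "Toyota", "Camry", "blue",
                 ["bumper sticker"], 36.85, -76.29,
                 "2025-09-22T08:00:00Z"), shared_db)
```

Every passing car adds a record like this, whether or not its driver is suspected of anything, which is why the resulting databases grow into a detailed log of everyone’s movements.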

ALPRs present a host of problems. ALPR systems vacuum up data—the make, model, color, and location of vehicles—on people who will never be involved in a crime, and have been used to “grid” areas, systematically recording when and where vehicles have been. ALPRs routinely make mistakes, causing police to stop the wrong car and terrorize the driver. Officers have abused law enforcement databases in hundreds of cases. Police have used them to track people across state lines as they sought legal health procedures. Even when there are laws against sharing data from these tools with other departments, some policing agencies still do.

Drones, meanwhile, give police a view of roofs, backyards, and other fenced areas where cops can’t casually patrol, and their adoption is becoming more common. Companies that sell drones have been helping law enforcement agencies get certifications from the Federal Aviation Administration (FAA), and recently implemented changes to the restrictions on flying drones beyond the visual line of sight will make it even easier for police to add this equipment. According to the FAA, since a new DFR waiver process was implemented in May 2025, the FAA has granted more than 410 such waivers, already accounting for almost a third of the approximately 1,400 DFR waivers that have been granted since such programs began in 2018.

Local officials should, of course, be informed that the drones they’re buying are equipped to do such granular surveillance from the sky, but it is not clear that this is happening. While the ALPR feature is available as part of Flock drone acquisitions, some government customers may not realize that approving a drone from Flock Safety may also mean approving a flying ALPR. And though not every Flock Safety drone is currently running the ALPR feature, some departments, like the Redondo Beach Police Department, have plans to activate it in the near future.

ALPRs aren’t the only so-called payloads that can be added to a drone. In addition to the high resolution and thermal cameras with which drones can already be equipped, drone manufacturers and police departments have discussed adding cell-site simulators, weapons, microphones, and other equipment. Communities must mobilize now to keep this runaway surveillance technology under tight control.

When EFF posed questions to Flock Safety about the integration of ALPR and its drones, the company declined to comment.

Mapping, storing, and tracking as much personal information as possible, all without warrants, is where automated police surveillance is heading right now. Flock has previously described its desire to connect ALPR scans to additional information about the person who owns the car, meaning we may not be far from a time when police see your vehicle drive by and quickly learn that it’s your car, along with a host of other details about you.

EFF has compiled a list of known drone-using police departments. Find out about your town’s surveillance tools at the Atlas of Surveillance. Know something we don't? Reach out at aos@eff.org.

Details About Chinese Surveillance and Propaganda Companies

22 September 2025 at 07:03

Details from leaked documents:

While people often look at China’s Great Firewall as a single, all-powerful government system unique to China, the actual process of developing and maintaining it works the same way as surveillance technology in the West. Geedge collaborates with academic institutions on research and development, adapts its business strategy to fit different clients’ needs, and even repurposes leftover infrastructure from its competitors.

[…]

The parallels with the West are hard to miss. A number of American surveillance and propaganda firms also started as academic projects before they were spun out into startups and grew by chasing government contracts. The difference is that in China, these companies operate with far less transparency. Their work comes to light only when a trove of documents slips onto the internet.

[…]

It is tempting to think of the Great Firewall or Chinese propaganda as the outcome of a top-down master plan that only the Chinese Communist Party could pull off. But these leaks suggest a more complicated reality. Censorship and propaganda efforts must be marketed, financed, and maintained. They are shaped by the logic of corporate quarterly financial targets and competitive bids as much as by ideology­—except the customers are governments, and the products can control or shape entire societies.

More information about one of the two leaks.

EFF, ACLU to SFPD: Stop Illegally Sharing Data With ICE and Anti-Abortion States

18 September 2025 at 15:44

The San Francisco Police Department is the latest California law enforcement agency to get caught sharing automated license plate reader (ALPR) data with out-of-state and federal agencies. EFF and the ACLU of Northern California are calling them out for this direct violation of California law, which has put every driver in the city at risk and is especially dangerous for immigrants, abortion seekers, and other targets of the federal government.

This week, we sent the San Francisco Police Department a demand letter and request for records under the city’s Sunshine Ordinance following the SF Standard’s recent report that SFPD provided non-California agencies direct access to the city’s ALPR database. Reporters uncovered that at least 19 searches run by these agencies were marked as related to U.S. Immigration and Customs Enforcement (“ICE”). The city’s ALPR database was also searched by law enforcement agencies from Georgia and Texas, both states with severe restrictions on reproductive healthcare.

ALPRs are cameras that capture the movements of vehicles and upload the location of the vehicles to a searchable, shareable database. It is a mass surveillance technology that collects data indiscriminately on every vehicle on the road. As of September 2025, SFPD operates 415 ALPR cameras purchased from the company Flock Safety.

Since 2016, sharing ALPR data with out-of-state or federal agencies—for any reason—violates California law (SB 34). If this data is shared for the purpose of assisting with immigration enforcement, agencies violate an additional California law (SB 54).

In total, the SF Standard found that SFPD had allowed out-of-state cops to run 1.6 million searches of their data. “This sharing violated state law, as well as exposed sensitive driver location information to misuse by the federal government and by states that lack California’s robust privacy protections,” the letter explained.

EFF and ACLU are urging SFPD to launch a thorough audit of its ALPR database, institute new protocols for compliance, and assess penalties and sanctions for any employee found to be sharing ALPR information out of state.

“Your office reportedly claims that agencies outside of California are no longer able to access the SFPD ALPR database,” the letter says. “However, your office has not explained how outside agencies obtained access in the first place or how you plan to prevent future violations of SB 34 and 54.”

As we’ve demonstrated over and over again, many California agencies continue to ignore these laws, exposing sensitive location information to misuse and putting entire communities at risk. As federal agencies continue to carry out violent ICE raids, and many states enforce harsh, draconian restrictions on abortion, ALPR technology is already being used to target and surveil immigrants and abortion seekers. California agencies, including SFPD, have an obligation to protect the rights of Californians, even when those rights are not recognized by other states or the federal government.

See the full letter here: https://www.eff.org/files/2025/09/17/aclu_and_eff_letter_to_sfpd_9.16.2025-1.pdf

California, Tell Governor Newsom: Regulate AI Police Reports and Sign S.B. 524

16 September 2025 at 15:30

The California legislature has passed a necessary piece of legislation, S.B. 524, which starts to regulate police reports written by generative AI. Now, it’s up to us to make sure Governor Newsom will sign the bill. 

We must make our voices heard. These technologies obscure certain records and drafts from public disclosure, and vendors have invested heavily in their ability to sell genAI to police.

TAKE ACTION

AI-generated police reports are spreading rapidly. The most popular product on the market is Draft One from Axon, which is already one of the country’s biggest purveyors of police tech, including body-worn cameras. By bundling its products together, Axon has capitalized on its customer base to spread an opaque and potentially harmful genAI product.

Many things can go wrong when genAI is used to write narrative police reports. First, because the product relies on body-worn camera audio, there’s a big chance of the AI draft missing context like sarcasm, culturally specific vocabulary and slang, or languages other than English. While police are expected to edit the AI’s version of events to make up for these flaws, many officers will simply defer to the AI. Police are also supposed to make an independent decision before arresting a person identified by face recognition–and police mess that up all the time. The prosecutor of King County, Washington, has forbidden local officers from using Draft One out of fear that it is unreliable.

Then, of course, there’s the matter of dishonesty. Many public defenders and criminal justice practitioners have voiced concerns about what this technology will do to cross-examination. If caught telling a different story on the stand than the one in their police report, an officer can easily say, “The AI wrote that and I didn’t edit it well enough.” The genAI creates a layer of plausible deniability, and carelessness is a very different offense than lying on the stand.

To make matters worse, an EFF investigation found that Axon’s Draft One product defies transparency by design. The technology is deliberately built to obscure which portions of a finished report were written by AI and which were written by an officer–making it difficult to determine whether an officer is lying about who wrote what.

But now, California has an important chance to join states like Utah in passing laws that rein in these technologies and set the minimum safeguards and transparency that must accompany their use.

S.B. 524 does several important things: It mandates that police reports written by AI include disclaimers on every page or within the body of the text that make it clear that this report was written in part or in total by a computer. It also says that any reports written by AI must retain their first draft. That way, it should be easier for defense attorneys, judges, police supervisors, or any other auditing entity to see which portions of the final report were written by AI and which parts were written by the officer. Further, the bill requires officers to sign and verify that they read the report and its facts are correct. And it bans AI vendors from selling or sharing the information a police agency provided to the AI.
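In software terms, the bill’s core requirements amount to a simple audit trail: retain the AI’s first draft, label AI involvement, and record the officer’s verification. This hypothetical Python sketch (the names are illustrative, not taken from the bill’s text) shows what compliant report storage implies:

```python
# Hypothetical sketch of the audit trail S.B. 524 implies. Field and
# method names are illustrative, not taken from the bill's text.
from dataclasses import dataclass

@dataclass
class AIAssistedReport:
    ai_first_draft: str        # must be retained under the bill
    disclaimer: str = "This report was drafted in part or in total by AI."
    final_text: str = ""
    officer_verified: bool = False

    def finalize(self, edited_text, verified):
        """Officer edits the draft and signs off that the facts are correct."""
        self.final_text = edited_text
        self.officer_verified = verified
        # Auditors can later diff ai_first_draft against final_text to
        # see which portions the officer actually wrote.
        return f"{self.disclaimer}\n{self.final_text}"

report = AIAssistedReport(ai_first_draft="Subject appeared agitated.")
published = report.finalize("Subject was calm; AI draft corrected.", True)
```

The point of retaining the first draft is exactly this diff: without it, there is no way to tell which words came from the model and which from the officer.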

These common-sense, first-step reforms are important: watchdogs are struggling to figure out where and how AI is being used in a police context. In fact, Axon’s Draft One would be out of compliance with this bill, which would require Axon to redesign the tool to make it more transparent—a small win for communities everywhere.

So now we’re asking you: help us make a difference. Use EFF’s Action Center to tell Governor Newsom to sign S.B. 524 into law! 

TAKE ACTION

San Francisco Gets An Invasive Billionaire-Bought Surveillance HQ

10 September 2025 at 12:04

San Francisco billionaire Chris Larsen once again has wielded his wallet to keep city residents under the eye of all-seeing police surveillance. 

The San Francisco Police Commission, the Board of Supervisors, and Mayor Daniel Lurie have signed off on Larsen’s $9.4 million gift of a new Real-Time Investigations Center. The plan involves moving the city’s existing police tech hub from the public Hall of Justice not to the city’s brand-new police headquarters but to a sublet in the Financial District building of Ripple Labs, Larsen’s crypto-transfer company. Although the city reportedly won’t be paying for the space, the lease cost Ripple $2.3 million and will last until December 2026.

The deal will also include a $7.25 million gift from the San Francisco Police Community Foundation that Larsen created. Police foundations are semi-public fundraising arms of police departments that allow them to buy technology and gear that the city will not give them money for.  

In Los Angeles, the city’s police foundation got $178,000 from the company Target to pay for the services of the data analytics company Palantir to use for predictive policing. In Atlanta, the city’s police foundation funds a massive surveillance apparatus as well as the much-maligned Cop City training complex. (Despite police foundations’ insistence that they are not public entities and therefore do not need to be transparent or answer public records requests, a judge recently ordered the Atlanta Police Foundation to release documentation related to Cop City.) 

A police foundation in San Francisco brings the same concerns: that an unaccountable and untransparent fundraising arm, schmoozing with corporations and billionaires, would fund unpopular surveillance measures without having to reveal much to the public.

Larsen was one of the deep pockets behind last year’s Proposition E, a ballot measure to supercharge surveillance in the city. The measure usurped the city’s 2019 surveillance transparency and accountability ordinance, which had required the SFPD to get the elected Board of Supervisors’ approval before buying and using new surveillance technology. This common-sense democratic hurdle was, apparently, a bridge too far for the SFPD and for Larsen.  

We’re no fans of real-time crime centers (RTCCs), as they’re often called elsewhere, to start with. They’re basically control rooms that pull together all feeds from a vast warrantless digital dragnet, often including automated license plate readers, fixed cameras, officers’ body-worn cameras, drones, and other sources. It’s a means of consolidating constant surveillance of the entire population, tracking everyone wherever they go and whatever they do – worrisome at any time, but especially in a time of rising authoritarianism.  

Think of what this data could do if it got into federal hands; imagine how vulnerable city residents would be to harassment if every move they made were centralized and recorded downtown. But you don’t have to imagine, because SFPD has already been caught sharing automated license plate reader data with out-of-state law enforcement agencies assisting in federal immigration investigations.

We’re especially opposed to RTCCs using live feeds from non-city surveillance cameras to push that panopticon’s boundaries even wider, as San Francisco’s does. Those semi-private networks of some 15,000 cameras, already abused by SFPD to surveil lawful protests against police violence, were funded in part by – you guessed it – Chris Larsen. 

These technologies could endanger San Franciscans by directing armed police at them on the word of a faulty algorithm, or by putting already-marginalized communities at further risk of overpolicing and surveillance. And studies find that these technologies just don’t work. If the goal is to stop crime before it happens, to spare someone the hardship and trauma of getting robbed or hurt, cameras clearly do not accomplish this. There is plenty of footage of crime occurring that belies the idea that surveillance is an effective deterrent, and although police often look to technology as a silver bullet to fight crime, evidence suggests it does little to alter the historic ebbs and flows of criminal activity.

Yet now this unelected billionaire – who already helped gut police accountability and transparency rules and helped fund sketchy surveillance of people exercising their First Amendment rights – wants to bankroll, expand, and host the police’s tech nerve center. 

Policing must be a public function so that residents can control, and demand accountability and transparency from, those who serve and protect but also surveil and track us all. Being financially beholden to private interests erodes the community’s trust and control, and can leave the public high and dry if a billionaire’s whims change or conflict with the will of the people. Chris Larsen could have tried to address the root causes of crime that affect our community; instead, he flexes his bank account’s muscle to decide that surveillance is best for San Franciscans with less in their wallets.

Elected officials should have said “thanks but no thanks” to Larsen and ensured that the San Francisco Police Department remained under the complete control and financial auspices of nobody except the people of San Francisco. Rich people should not be allowed to fund the further degradation of our privacy as we go about our lives in our city’s public places. Residents should carefully watch what comes next to decide for themselves whether a false sense of security is worth living under constant, all-seeing, billionaire-bankrolled surveillance. 

You Went to a Drag Show—Now the State of Florida Wants Your Name

28 July 2025 at 14:59

If you thought going to a Pride event or drag show was just another night out, think again. If you were in Florida, it might land your name in a government database.

That’s what’s happening in Vero Beach, FL, where the Florida Attorney General’s office has subpoenaed a local restaurant, The Kilted Mermaid, demanding surveillance video, guest lists, reservation logs, and contracts of performers and other staff—all because the venue hosted an LGBTQ+ Pride event.

To be clear: no one has been charged with a crime, and the law Florida is likely leaning on here—the so-called “Protection of Children Act” (which was designed to be a drag show ban)—has already been blocked by federal courts as likely unconstitutional. But that didn’t stop Attorney General James Uthmeier from pushing forward anyway. Without naming a specific law that was violated, the AG’s press release used pointed and accusatory language, stating that "In Florida, we don't sacrifice the innocence of children for the perversions of some demented adults.” His office is now fishing for personal data about everyone who attended or performed at the event. This should set off every civil liberties alarm bell we have.

Just like the Kids Online Safety Act (KOSA) and other bills with misleading names, this isn’t about protecting children. It’s about using the power of the state to intimidate people government officials disagree with, and to censor speech that is both lawful and fundamental to American democracy.

Drag shows—many of which are family-friendly and feature no sexual content—have become a political scapegoat. And while that rhetoric might resonate in some media environments, the real-world consequences are much darker: state surveillance of private citizens doing nothing but attending a fun community celebration. By demanding surveillance video, guest lists, and reservation logs, the state isn’t investigating a crime; it is trying to scare individuals away from attending a legal gathering. These are people who showed up at a public venue for a legal event, while the law restricting it was not even in effect.

The Supreme Court has ruled multiple times that subpoenas forcing disclosure of members of peaceful organizations have a chilling effect on free expression. Whether it’s a civil rights protest, a church service, or, yes, a drag show: the First Amendment protects the confidentiality of lists of attendees.

Even if the courts strike down this subpoena—and they should—the damage will already be done. A restaurant owner (who also happens to be the town’s vice mayor) is being dragged into a state investigation. Performers’ identities are potentially being exposed—whether to state surveillance, inclusion in law enforcement databases, or future targeting by anti-LGBTQ+ groups. Guests who thought they were attending a fun community event are now caught up in a legal probe. These are the kinds of chilling, damaging consequences that will discourage Floridians from hosting or attending drag shows, and could stamp out the art form entirely. 

EFF has long warned about this kind of mission creep: where a law or policy supposedly aimed at public safety is turned into a tool for political retaliation or mass surveillance. Going to a drag show should not mean you forfeit your anonymity. It should not open you up to surveillance. And it absolutely should not land your name in a government database.

You Shouldn’t Have to Make Your Social Media Public to Get a Visa

23 July 2025 at 18:33

The Trump administration is continuing its dangerous push to surveil and suppress foreign students’ social media activity. The State Department recently announced an unprecedented new requirement that applicants for student and exchange visas must set all social media accounts to “public” for government review. The State Department also indicated that if applicants refuse to unlock their accounts or otherwise don’t maintain a social media presence, the government may interpret it as an attempt to evade the requirement or deliberately hide online activity.

The administration is penalizing prospective students and visitors for shielding their social media accounts from the general public or for choosing to not be active on social media. This is an outrageous violation of privacy, one that completely disregards the legitimate and often critical reasons why millions of people choose to lock down their social media profiles, share only limited information about themselves online, or not engage in social media at all. By making students abandon basic privacy hygiene as the price of admission to American universities, the administration is forcing applicants to expose a wealth of personal information to not only the U.S. government, but to anyone with an internet connection.

Why Social Media Privacy Matters

The administration’s new policy is a dangerous expansion of existing social media collection efforts. While the State Department has required since 2019 that visa applicants disclose their social media handles—a policy EFF has consistently opposed—forcing applicants to make their accounts public crosses a new line.

Individuals have significant privacy interests in their social media accounts. Social media profiles contain some of the most intimate details of our lives, such as our political views, religious beliefs, health information, likes and dislikes, and the people with whom we associate. Such personal details can be gleaned from vast volumes of data given the unlimited storage capacity of cloud-based social media platforms. As the Supreme Court has recognized, “[t]he sum of an individual’s private life can be reconstructed through a thousand photographs labeled with dates, locations, and descriptions”—all of which and more are available on social media platforms.

By requiring visa applicants to share these details, the government can obtain information that would otherwise be inaccessible or difficult to piece together across disparate locations. For example, while visa applicants are not required to disclose their political views in their applications, applicants might choose to post their beliefs on their social media profiles.

This information, once disclosed, doesn’t just disappear. Existing policy allows the government to continue surveilling applicants’ social media profiles even once the application process is over. And personal information obtained from applicants’ profiles can be collected and stored in government databases for decades.

What’s more, by requiring visa applicants to make their private social media accounts public, the administration is forcing them to expose troves of personal, sensitive information to the entire internet, not just the U.S. government. This could include various bad actors like identity thieves and fraudsters, foreign governments, current and prospective employers, and other third parties.

Those in applicants’ social media networks—including U.S. citizen family or friends—can also become surveillance targets by association. Visa applicants’ online activity is likely to reveal information about the users with whom they’re connected. For example, a visa applicant could tag another user in a political rant or post photos of themselves and the other user at a political rally. Anyone who sees those posts might reasonably infer that the other user shares the applicant’s political beliefs. The administration’s new requirement will therefore publicly expose the personal information of millions of additional people, beyond just visa applicants.

There are Very Good Reasons to Keep Social Media Accounts Private

An overwhelming number of social media users maintain private accounts for the same reason we put curtains on our windows: a desire for basic privacy. There are numerous legitimate reasons people choose to share their social media only with trusted family and friends, whether that’s ensuring personal safety, maintaining professional boundaries, or simply not wanting to share personal profiles with the entire world.

Safety from Online Harassment and Physical Violence

Many people keep their accounts private to protect themselves from stalkers, harassers, and those who wish them harm. Domestic violence survivors, for example, use privacy settings to hide from their abusers, and organizations supporting survivors often encourage them to maintain a limited online presence.

Women also face a variety of gender-based online harms made worse by public profiles, including stalking, sexual harassment, and violent threats. A 2021 study reported that at least 38% of women globally had personally experienced online abuse, and at least 85% of women had witnessed it. Women are, in turn, more likely to activate privacy settings than men.

LGBTQ+ individuals similarly have good reasons to lock down their accounts. Individuals from countries where their identity puts them in danger rely on privacy protections to stay safe from state action. People may also reasonably choose to lock their accounts to avoid the barrage of anti-LGBTQ+ hate and harassment that is common on social media platforms, which can lead to real-world violence. Others, including LGBTQ+ youth, may simply not be ready to share their identity outside of their chosen personal network.

Political Dissidents, Activists, and Journalists

Activists working on sensitive human rights issues, political dissidents, and journalists use privacy settings to protect themselves from doxxing, harassment, and potential political persecution by their governments.

Rather than protecting these vulnerable groups, the administration’s policy instead explicitly targets political speech. The State Department has given embassies and consulates a vague directive to vet applicants’ social media for “hostile attitudes towards our citizens, culture, government, institutions, or founding principles,” according to an internal State Department cable obtained by multiple news outlets. This includes looking for “applicants who demonstrate a history of political activism.” The cable did not specify what, exactly, constitutes “hostile attitudes.”

Professional and Personal Boundaries

People use privacy settings to maintain boundaries between their personal and professional lives. They share family photos, sensitive updates, and personal moments with close friends—not with their employers, teachers, professional connections, or the general public.

The Growing Menace of Social Media Surveillance

This new policy is an escalation of the Trump administration’s ongoing immigration-related social media surveillance. EFF has written about the administration’s new “Catch and Revoke” effort, which deploys artificial intelligence and other data analytic tools to review the public social media accounts of student visa holders in an effort to revoke their visas. And EFF recently submitted comments opposing a USCIS proposal to collect social media identifiers from visa and green card holders already living in the U.S., including when they submit applications for permanent residency and naturalization.

The administration has also started screening many non-citizens' social media accounts for ambiguously-defined “antisemitic activity,” and previously announced expanded social media vetting for any visa applicant seeking to travel specifically to Harvard University for any purpose.

The administration claims this mass surveillance will make America safer, but there’s little evidence to support this. By the government’s own previous assessments, social media surveillance has not proven effective at identifying security threats.

At the same time, these policies gravely undermine freedom of speech, as we recently argued in our USCIS comments. The government is using social media monitoring to directly target and punish through visa denials or revocations foreign students and others for their digital speech. And the social media surveillance itself broadly chills free expression online—for citizens and non-citizens alike.

In defending the new requirement, the State Department argued that a U.S. visa is a “privilege, not a right.” But privacy and free expression should not be privileges. These are fundamental human rights, and they are rights we abandon at our peril.

When Your Power Meter Becomes a Tool of Mass Surveillance

21 July 2025 at 11:57

Simply using extra electricity to power some Christmas lights or a big fish tank shouldn’t bring the police to your door. In fact, in California, the law explicitly protects the privacy of power customers, prohibiting public utilities from disclosing precise “smart” meter data in most cases. 

Despite this, Sacramento’s power company and law enforcement agencies have been running an illegal mass surveillance scheme for years, using our power meters as home-mounted spies. The Electronic Frontier Foundation (EFF) is seeking to end Sacramento’s dragnet surveillance of energy customers and has asked for a court order to stop this practice for good.

For a decade, the Sacramento Municipal Utility District (SMUD) has been searching through all of its customers’ energy data and has passed on more than 33,000 tips about supposedly “high” usage households to police. Ostensibly looking for homes that were growing illegal amounts of cannabis, SMUD analysts have admitted that such “high” power usage could come from houses using air conditioning or heat pumps, or simply from a house being large. And the threshold of so-called “suspicion” has steadily dropped, from 7,000 kWh per month in 2014 to just 2,800 kWh a month in 2023. One SMUD analyst admitted that they themselves “used 3500 [kWh] last month.”

This scheme has targeted Asian customers. SMUD analysts deemed one home suspicious because it was “4k [kWh], Asian,” and another suspicious because “multiple Asians have reported there.” Sacramento police sent accusatory letters in English and Chinese, but no other language, to residents who used above-average amounts of electricity.

In 2022, EFF and the law firm Vallejo, Antolin, Agarwal, Kanter LLP sued SMUD and the City of Sacramento, representing the Asian American Liberation Network and two Sacramento County residents. One is an immigrant from Vietnam. Sheriff’s deputies showed up unannounced at his home, falsely accused him of growing cannabis based on an erroneous SMUD tip, demanded entry for a search, and threatened him with arrest when he refused. He has never grown cannabis; rather, he consumes more than average electricity due to a spinal injury.

Last week, we filed our main brief explaining how this surveillance program violates the law and why it must be stopped. California’s state constitution bars unreasonable searches. This type of dragnet surveillance — suspicionless searches of entire zip codes’ worth of customer energy data — is inherently unreasonable. Additionally, a state statute generally prohibits public utilities from sharing such data. As we write in our brief, Sacramento’s mass surveillance scheme does not qualify for any of the narrow exceptions to this rule.

Mass surveillance violates the privacy of many individuals, as police without individualized suspicion seek (possibly non-existent) evidence of some kind of offense by some unknown person. As we’ve seen time and time again, innocent people inevitably get caught in the dragnet. For decades, EFF has been exposing and fighting these kinds of dangerous schemes. We remain committed to protecting digital privacy, whether it’s being threatened by national governments – or your local power company.

Amazon Ring Cashes in on Techno-Authoritarianism and Mass Surveillance

18 July 2025 at 10:37

Ring founder Jamie Siminoff is back at the helm of the surveillance doorbell company, and with him is the surveillance-first, privacy-last approach that made Ring one of the most maligned tech devices. Not only is the company reintroducing new versions of old features that would allow police to request footage directly from Ring users, it is also introducing a new feature that would allow police to request live-stream access to people’s home security devices.

This is a bad, bad step for Ring and the broader public. 

Ring is rolling back many of the reforms it’s made in the last few years by easing police access to footage from millions of homes in the United States. This is a grave threat to civil liberties. After all, police have used Ring footage to spy on protestors, and have obtained footage without a warrant or the user’s consent. It is easy to imagine that law enforcement officials will use their renewed access to Ring footage to find people who have had abortions or to track down people for immigration enforcement.

Siminoff has announced in a memo seen by Business Insider that the company will now be reimagined from the ground up to be “AI first”—whatever that means for a home security camera that lets you see who is ringing your doorbell. We fear that this may signal the introduction of video analytics or face recognition to an already problematic surveillance device. 

It was also reported that employees at Ring will have to show proof that they use AI in order to get promoted. 

Not content to stop at new bad features, the company is also planning to roll back some of the necessary reforms Ring has made: it is partnering with Axon to build a new tool that would allow police to request Ring footage directly from users, and it will allow users to consent to letting police livestream directly from their devices.

After years of serving as the eyes and ears of police, the company was compelled by public pressure to make a number of necessary changes. They introduced end-to-end encryption, ended their formal partnerships with police, which were an ethical minefield, and discontinued the tool that facilitated police requests for footage directly to customers. Now they are pivoting back to being a tool of mass surveillance.

Why now? It is hard to believe the company is betraying the trust of its millions of customers in the name of “safety” when violent crime in the United States is at near-historic lows. It’s probably not about their customers—the FTC had to compel Ring to take its users’ privacy seriously.

No, this is most likely about Ring cashing in on the rising tide of techno-authoritarianism, that is, authoritarianism aided by surveillance tech. Too many tech companies want to profit from our shrinking liberties. Google likewise recently ended an old ethical commitment that prohibited it from profiting off of surveillance and warfare. Companies are locking down billion-dollar contracts by selling their products to the defense sector or police.

Shame on Ring.

Flock Safety’s Feature Updates Cannot Make Automated License Plate Readers Safe

27 June 2025 at 20:36

Two recent statements from the surveillance company—one addressing Illinois privacy violations and another defending the company's national surveillance network—reveal a troubling pattern: when confronted by evidence of widespread abuse, Flock Safety has blamed users, downplayed harms, and doubled down on the very systems that enabled the violations in the first place.

Flock's aggressive public relations campaign to salvage its reputation comes as no surprise. Last month, we described how investigative reporting from 404 Media revealed that a sheriff's office in Texas searched data from more than 83,000 automated license plate reader (ALPR) cameras to track down a woman suspected of self-managing an abortion. (A scenario that might have been avoided, it's worth noting, had Flock taken action when it was first warned about this threat three years ago.)

Flock calls the reporting on the Texas sheriff's office "purposefully misleading," claiming the woman was searched for as a missing person at her family's request rather than for her abortion. But that ignores the core issue: this officer used a nationwide surveillance dragnet (again: over 83,000 cameras) to track someone down, and used her suspected healthcare decisions as a reason to do so. Framing this as concern for her safety plays directly into anti-abortion narratives that depict abortion as dangerous and traumatic in order to justify increased policing, criminalization, control—and, ultimately, surveillance.

Flock Safety has blamed users, downplayed harms, and doubled down on the very systems that enabled the violations in the first place.

As if that weren't enough, the company has also come under fire for how its ALPR network data is being actively used to assist in mass deportation. Despite U.S. Immigration and Customs Enforcement (ICE) having no formal agreement with Flock Safety, public records revealed "more than 4,000 nation and statewide lookups by local and state police done either at the behest of the federal government or as an 'informal' favor to federal law enforcement, or with a potential immigration focus." The network audit data analyzed by 404 exposed an informal data-sharing environment that creates an end-run around oversight and accountability measures: federal agencies can access the surveillance network through local partnerships without the transparency and legal constraints that would apply to direct federal contracts.

Flock Safety is adamant this is "not Flock's decision," and by implication, not their fault. Instead, the responsibility lies with each individual local law enforcement agency. In the same breath, they insist that data sharing is essential, loudly claiming credit when the technology is involved in cross-jurisdictional investigations—but failing to show the same attitude when that data-sharing ecosystem is used to terrorize abortion seekers or immigrants. 

Flock Safety: The Surveillance Social Network

In growing from a 2017 startup to a $7.5 billion company "serving over 5,000 communities," Flock allowed individual agencies wide latitude to set and regulate their own policies. In effect, this approach offered cheap surveillance technology with minimal restrictions, leaving major decisions and actions in the hands of law enforcement while the company scaled rapidly.

And they have no intention of slowing down. Just this week, Flock launched its Business Network, facilitating unregulated data sharing among its private sector security clients. "For years, our law enforcement customers have used the power of a shared network to identify threats, connect cases, and reduce crime. Now, we're extending that same network effect to the private sector," Flock Safety's CEO announced.

A crowd around the Flock Safety set-up at a police conference.

Flock Safety wooing law enforcement officers at the 2023 International Chiefs of Police Conference.

The company is building out a new mass surveillance network using the exact template that ended with the company having to retrain thousands of officers in Illinois on how not to break state law—the same template that made it easy for officers to do so in the first place. Flock's continued integration of disparate surveillance networks across the public and private spheres—despite the harms that have already occurred—is owed in part to the one thing that it's gotten really good at over the past couple of years: facilitating a surveillance social network. 

Employing marketing phrases like "collaboration" and "force multiplier," Flock encourages as much sharing as possible, going as far as to claim that network effects can significantly improve case closure rates. They cultivate a sense of shared community and purpose among users so they opt into good faith sharing relationships with other law enforcement agencies across the country. But it's precisely that social layer that creates uncontrollable risk.

The possibility of human workarounds at every level undermines any technical safeguards Flock may claim. Search term blocking relies on officers accurately labeling search intent—a system easily defeated by entering vague reasons like "investigation" or incorrect justifications, whether intentionally or not. And, of course, words like "investigation" or "missing person" can mean virtually anything, offering no value to meaningful oversight of how and for what the system is being used. Moving forward, sheriff's offices looking to avoid negative press can surveil abortion seekers or immigrants with ease, so long as they enter vague, innocuous-sounding reasons.

The same can be said for case number requirements, which depend on manual entry and can easily be circumvented by reusing legitimate case numbers for unauthorized searches. Audit logs only track inputs, not contextual legitimacy. And Flock's proposed AI-driven audit alerts, which might flag suspicious activity only after searches (and harm) have already occurred, rely on local agencies to self-monitor misuse—despite their demonstrated inability to do so.

Flock operates as a single point of failure that can compromise—and has compromised—the privacy of millions of Americans simultaneously.

And, of course, even the most restrictive department policy may not be enough. Austin, Texas, had implemented one of the most restrictive ALPR programs in the country, and the program still failed: the city's own audit revealed systematic compliance failures that rendered its guardrails meaningless. The company's continued appeal to "local policies" means nothing when Flock's data-sharing network does not account for how law enforcement policies, regulations, and accountability vary by jurisdiction. You may have a good relationship with your local police, who solicit your input on what their policy looks like; you don't have that same relationship with hundreds or thousands of other agencies with whom they share their data. So if an officer on the other side of the country violates your privacy, it’d be difficult to hold them accountable. 

ALPR surveillance systems are inherently vulnerable to both technical exploitation and human manipulation. These vulnerabilities are not theoretical—they represent real pathways for bad actors to access vast databases containing millions of Americans' location data. When surveillance databases are breached, the consequences extend far beyond typical data theft—this information can be used to harass, stalk, or even extort. The intimate details of people's daily routines, their associations, and their political activities may become available to anyone with malicious intent. Flock operates as a single point of failure that can compromise—and has compromised—the privacy of millions of Americans simultaneously.

Don't Stop de-Flocking

Rather than addressing legitimate concerns about privacy, security, and constitutional rights, Flock has only promised updates that fall short of meaningful reforms. These software tweaks and feature rollouts cannot assuage the fear engendered by the massive surveillance system it has built and continues to expand.

A close-up of a Flock Safety camera on a pole

A typical specimen of Flock Safety's automated license plate readers.

Flock's insistence that what's happening with abortion criminalization and immigration enforcement has nothing to do with them—that these are just red-state problems or the fault of rogue officers—is concerning. Flock designed the network that is being used, and the public should hold them accountable for failing to build in protections from abuse that cannot be easily circumvented.

Thankfully, that's exactly what's happening: cities like Austin, San Marcos, Denver, Norfolk, and San Diego are pushing back. And it's not nearly as hard a choice as Flock would have you believe: Austinites are weighing the benefits of a surveillance system that generates a hit less than 0.02% of the time against the possibility that scanning 75 million license plates will result in an abortion seeker being tracked down by police, or an immigrant being flagged by ICE in a so-called "sanctuary city." These are not hypothetical risks. They are already happening.

Given how pervasive, sprawling, and ungovernable ALPR sharing networks have become, the only feature update we can truly rely on to protect people's rights and safety is no network at all. And we applaud the communities taking decisive action to dismantle this surveillance infrastructure.

Follow their lead: don't stop de-flocking.

Georgia Court Rules for Transparency over Private Police Foundation

27 June 2025 at 11:51

A Georgia court has decided that private non-profit Atlanta Police Foundation (APF) must comply with public records requests under the Georgia Open Records Act for some of its functions on behalf of the Atlanta Police Department. This is a major win for transparency in the state. 

The lawsuit was brought last year by the Atlanta Community Press Collective (ACPC) and Electronic Frontier Alliance member Lucy Parsons Labs (LPL). It concerns the APF’s refusal to disclose records about its role as the lessee and manager of the site of so-called Cop City, the Atlanta Public Safety Training Center at the heart of a years-long battle that pitted local social and environmental movements against the APF. We’ve previously written about how APF and similar groups fund police surveillance technology, and how the Atlanta Police Department spied on the social media of activists opposed to Cop City.

This is a big win for transparency and for local communities who want to maintain their right to know what public agencies are doing. 

Police foundations often provide resources to police departments that help them avoid public oversight, and the Atlanta Police Foundation leads the way with its maintenance of the Loudermilk Video Integration Center and its role in Cop City, which will be used by public agencies including the Atlanta Police Department and others.

ACPC and LPL were represented by attorneys Joy Ramsingh, Luke Andrews, and Samantha Hamilton, who won the release of some materials this past December. The plaintiffs had earlier been represented by the University of Georgia School of Law First Amendment Clinic.

The win comes at just the right time. Last summer, the Georgia Supreme Court ruled that private contractors working for public entities are subject to open records laws. The Georgia state legislature then passed a bill to make it harder to file public records requests against private entities. With this month’s ruling, there is still time for the Atlanta Police Foundation to appeal the decision, but failing that, it will have to begin complying with public records requests by the beginning of July.

We hope that this will help ensure transparency and accountability when government agencies farm out public functions to private entities, so that local activists and journalists will be able to uncover materials that should be available to the general public. 
