Dozens of Police Agencies in California Are Still Sharing Driver Locations with Anti-Abortion States. We're Fighting Back.

Over the last decade, California has built up some of the nation’s strongest driver privacy protections, thanks to the hard work of activists, civil rights groups, and elected leaders.

One law in particular, often called [SB 34](https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200SB34), prohibits police from sharing detailed maps of people’s driving patterns with the federal government and with agencies in other states, a protection that has only grown more important since the end of *Roe v. Wade* and the subsequent surge in abortion criminalization.

But dozens of California police departments have decided to defy the law, even after receiving [clear guidance](https://oag.ca.gov/system/files/media/2023-dle-06.pdf) from California Attorney General Rob Bonta, the chief law enforcement officer in the state. Last month the ACLU of Northern California and our partners [sent Attorney General Bonta a letter](https://www.aclunc.org/sites/default/files/2024-01-31_letter_to_ag_bonta_re_sb_34_final.pdf) listing 35 police agencies that have refused to comply with the law and protect driver privacy.

We should all be able to drive to a doctor’s office, place of worship, or political rally without being tracked and cataloged by police agencies. But for years now, police have used automated license plate readers (ALPRs) to record and track the movements of drivers on a previously unseen scale. These [systems](https://www.aclu.org/documents/you-are-being-tracked-how-license-plate-readers-are-being-used-record-americans-movements) allow police to collect and store information about drivers whose cars pass through ALPR cameras’ fields of view; each record, along with the date and time of capture, can reveal sensitive details about our movements and, as a result, our private lives.
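The sketch below illustrates why these records are so revealing: each camera hit stores a plate number with a time and place, and a handful of hits on the same plate add up to a travel timeline. The field names and values here are hypothetical, not taken from any particular vendor’s system.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PlateRead:
    """One hypothetical ALPR detection: a plate, plus where and when it was seen."""
    plate: str
    seen_at: datetime
    camera_location: str

# A few illustrative reads of the same made-up plate over one morning.
reads = [
    PlateRead("7ABC123", datetime(2024, 2, 13, 8, 2), "Hwy 101 @ Marin St"),
    PlateRead("7ABC123", datetime(2024, 2, 13, 8, 41), "Main St @ 3rd Ave"),
    PlateRead("7ABC123", datetime(2024, 2, 13, 9, 5), "lot outside a health clinic"),
]

# Sorting one plate's reads by time turns isolated camera hits into a travel
# timeline -- which is why sharing these records beyond the state matters.
for r in sorted(reads, key=lambda r: r.seen_at):
    print(r.seen_at, r.camera_location)
```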
Related: [You Are Being Tracked: How License Plate Readers Are Being Used to Record Americans’ Movements](https://www.aclu.org/documents/you-are-being-tracked-how-license-plate-readers-are-being-used-record-americans-movements) (Source: American Civil Liberties Union)

The ACLU has long seen the danger ALPR surveillance poses and, working alongside communities on the ground, has fought to bolster California’s legal protections for driver privacy. For over a decade, we have conducted investigations, advocacy, and litigation focused on how police agencies use ALPRs to track law-abiding drivers, amass troves of sensitive information, and use it to harm people.

In the wake of the [ACLU’s groundbreaking report](https://www.aclu.org/files/assets/071613-aclu-alprreport-opt-v05.pdf) on ALPR use across the U.S., [we called out](https://www.aclunc.org/blog/use-automated-license-plate-readers-expanding-northern-california-and-data-shared-feds) police use of ALPRs in 2013 as a threat to driver privacy and warned that California lacked statewide driver privacy protections. In 2016, thanks in part to the advocacy of the ACLU and [allies](https://www.eff.org/deeplinks/2015/10/success-sacramento-four-new-laws-one-veto-all-victories-privacy-and-transparency), the California legislature passed [SB 34](https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200SB34), the law at issue today. In [2019](https://www.aclu.org/news/immigrants-rights/documents-reveal-ice-using-driver-location-data) we discovered Immigration and Customs Enforcement’s (ICE) exploitation of ALPR-collected information to track and target immigrants in California and across the United States.

From there, we took action to enforce California’s driver privacy protections. In [2021](https://www.aclunc.org/news/california-activists-sue-marin-county-sheriff-illegally-sharing-drivers-license-plate-data-ice) we sued Marin County, California, for illegally sharing millions of local drivers’ license plates and locations with federal and out-of-state agencies, including ICE. The sheriff eventually agreed to comply with SB 34 as part of a [settlement agreement](https://www.aclunc.org/our-work/legal-docket/lagleva-v-doyle-license-plate-surveillance), but we believed that many other California police agencies were still violating SB 34.

We rang the alarm again in the wake of the *Dobbs* decision overturning *Roe v. Wade*. Alongside our partners at the Electronic Frontier Foundation and the ACLU of Southern California, we [sent letters to over 70 law enforcement agencies in California](https://www.aclunc.org/news/civil-liberties-groups-demand-california-police-stop-sharing-drivers-location-data-police-anti) demanding they stop sharing people’s driving patterns with states that have criminalized abortion care. We also notified the attorney general’s office of these violations.

Following our letters, the attorney general issued [instructions](https://oag.ca.gov/system/files/media/2023-dle-06.pdf) to police across the state to follow SB 34’s plain text and stop sharing license plate information with state and federal agencies outside California. While some agencies have come into compliance, many police are digging in and refusing to follow the law. Police lobbyists have even [asked](https://www.eff.org/files/2024/01/23/bulletin_reponse_letter.03_jrt_final.khb_.02.pdf) the attorney general to withdraw his interpretation of the law.

Simply put, the position touted by police agencies and their lobbyists puts Californians at risk. SB 34 matters because when police track and share the locations of law-abiding drivers, that information can easily be used to facilitate racist policing, [punitive fees](https://www.buzzfeednews.com/article/alexcampbell/the-ticket-machine), and the [discriminatory targeting](https://www.ap.org/ap-in-the-news/2012/with-cameras-informants-nypd-eyed-mosques) of people in California and beyond. And, as [our letters warned](https://www.eff.org/files/2023/05/24/tracy.pdf), when California shares ALPR information with authorities in states with anti-abortion or anti-trans laws, police and prosecutors gain new power to track and prosecute people who traveled to California to receive reproductive or gender-affirming care.

We should all be able to travel safely on the state’s roads without our movements being handed to authorities outside the state. That is why we have continued to push California police agencies to follow California’s driver privacy law, and why we have supported localities [that reject](https://www.aclunc.org/blog/alameda-rejects-surveillance-deal-company-tied-ice) ALPR programs at odds with their values.

It is unacceptable that police agencies charged with enforcing laws are refusing to comply with this one. While we are pleased with Attorney General Bonta’s strong statement on SB 34, we urge the attorney general to use all means at his disposal to ensure compliance. And rest assured that the ACLU will continue fighting to enact and enforce protections that keep all of us safe, no matter where we go in the state.

*This article was [originally featured](https://www.aclunc.org/blog/californians-fought-hard-driver-privacy-protections-why-are-police-refusing-follow-them) on the blog of the ACLU of Northern California.*

When it Comes to Facial Recognition, There is No Such Thing as a Magic Number

We often hear about government misuse of face recognition technology (FRT) and how it can [derail](https://www.wired.com/story/wrongful-arrests-ai-derailed-3-mens-lives/) a person’s life through wrongful arrests and other harms. Despite mounting evidence, government agencies continue to push face recognition systems on communities across the United States. Key to this effort are the corporate makers and sellers who market the technology as reliable, accurate, and safe, often by pointing to their products’ scores on government-run performance tests.

All of this might tempt policymakers to believe that the safety and civil rights problems of facial recognition can be solved by mandating a certain performance score or grade. However, relying solely on test scores risks obscuring deeper problems with face recognition while overstating its effectiveness and real-life safety.

How are facial recognition systems tested?

Many facial recognition systems are tested by the federal National Institute of Standards and Technology (NIST). In one of its tests, NIST uses companies’ algorithms to search for a face within a large “matching database” of faces. In broad strokes, this test resembles how police use face recognition today: an image of a single unknown person’s face is fed into an algorithm that compares it against a large database of mugshot or driver’s license photos and generates suggested images, paired with numbers that represent estimates of how similar the images are.

These and other tests can reveal disturbing racial disparities. In their [groundbreaking research](http://gendershades.org/overview.html), computer scientists Dr. Joy Buolamwini and Dr. Timnit Gebru tested several prominent gender classification algorithms and [found that](https://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf) those systems were less likely to accurately classify the faces of women with darker complexions. Following that, the ACLU of Northern California performed its own test of Amazon’s facial recognition software, which [falsely matched](https://www.aclu.org/news/privacy-technology/amazons-face-recognition-falsely-matched-28) the faces of 28 members of Congress with faces in a mugshot database, with members of Congress of color misidentified at higher rates. Since then, additional testing by [NIST](https://pages.nist.gov/frvt/reports/demographics/nistir_8429.pdf) and [academic researchers](https://openaccess.thecvf.com/content/WACV2023W/DVPBA/papers/Bhatta_The_Gender_Gap_in_Face_Recognition_Accuracy_Is_a_Hairy_WACVW_2023_paper.pdf) indicates that these problems persist.

While testing facial recognition for accuracy and fairness across race, sex, and other characteristics is critical, these tests do not take full account of practical realities. No laboratory test represents the conditions of [how police use face recognition](https://www.aclu.org/cases/parks-v-mccormac?document=Amicus-Brief) in real-world scenarios. For one, testing labs do not have access to the exact “matching database,” the particular digital library of faces from mugshots, licenses, and surveillance photos, that police in a specific community search through when they operate face recognition. And tests cannot account for the full range of low-quality images from surveillance cameras ([and truly dubious sources](https://www.wired.com/story/parabon-nanolabs-dna-face-models-police-facial-recognition/)) that police feed into these systems, or for the trouble police have when visually reviewing and choosing from a set of possible matches produced by the technology.
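To make the structure of that search concrete, here is a minimal sketch of a 1:N identification lookup: a probe face is compared against every entry in a gallery, and the system returns a ranked candidate list with similarity scores. This is an illustrative simplification under assumed conventions (cosine similarity over precomputed face embeddings); the function and variable names are made up, and real systems rely on trained embedding models and proprietary matchers.

```python
import numpy as np

def one_to_n_search(probe_embedding, gallery_embeddings, gallery_ids, top_k=10, threshold=0.5):
    """Illustrative 1:N face identification search.

    Compares one probe face embedding against a gallery of embeddings
    (e.g., mugshot or license photos) and returns the top_k most similar
    entries whose similarity clears the threshold -- the 'digital lineup'
    a reviewer would look at. A sketch only, not any vendor's real matcher.
    """
    # Cosine similarity between the probe and every gallery entry.
    probe = probe_embedding / np.linalg.norm(probe_embedding)
    gallery = gallery_embeddings / np.linalg.norm(gallery_embeddings, axis=1, keepdims=True)
    similarities = gallery @ probe

    # Rank the gallery by similarity and keep the top_k candidates.
    order = np.argsort(similarities)[::-1][:top_k]
    return [(gallery_ids[i], float(similarities[i]))
            for i in order if similarities[i] >= threshold]

# Hypothetical usage with random vectors standing in for a real model's embeddings.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 128))
ids = [f"person_{i}" for i in range(1000)]
probe = rng.normal(size=128)
print(one_to_n_search(probe, gallery, ids, top_k=5, threshold=-1.0))
```

Everything downstream of this call (which gallery the agency loaded, how the threshold and list length are configured, and how an officer interprets the returned list) sits outside what a laboratory score measures.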
Related: [Parks v. McCormac](https://www.aclu.org/cases/parks-v-mccormac), in which the ACLU and the ACLU of New Jersey filed an amicus brief in the U.S. District Court for the District of New Jersey on January 29, 2024 (Source: American Civil Liberties Union)

In response to these real concerns, vendors routinely hold up their performance on tests in their [marketing to government agencies](https://www.clearview.ai/post/debunking-the-three-biggest-myths-about-clearview-ai) as evidence of facial recognition’s reliability and accuracy. Lawmakers have also [sought to legislate performance scores](https://legiscan.com/CA/text/AB642/id/2796168) that set across-the-board accuracy or error-rate requirements and would let police use any FRT system that clears them. This approach would be misguided.

How can performance scores be misleading?

It is easy to be misled by performance scores. Imagine a requirement that police may only use systems that produce an overall true positive rate above 98 percent in testing, where the true positive rate measures how often the results returned by an FRT system include a match for the person depicted in the probe image when a matching image actually exists in the database. At first glance, that might sound like a pretty strong requirement, but a closer look reveals a very different story.

For one, police typically configure and customize facial recognition systems to return a list of multiple results, sometimes as many as hundreds. Think of this as a “digital lineup.” In NIST testing, if at least one of the returned results is a match for the probe image, the search is considered successful and counts toward the true positive rate. But even when this happens in practice, which certainly isn’t always the case, there is no guarantee that police will select the true match rather than one of the other results. True matches in testing can be crowded out by false matches in practice because of these police-created digital lineups. This alone makes it difficult to choose one universal performance score that can be applied to many different FRT systems.
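As a rough illustration of how that counting works, the sketch below scores a search as a “hit” whenever the correct identity appears anywhere in the returned candidate list, regardless of its rank or how many non-matching faces surround it. This is a simplified, hypothetical scoring rule, not NIST’s evaluation code, and the example data are invented.

```python
def true_positive_rate(searches):
    """Fraction of mated searches whose candidate list contains the
    correct identity anywhere in it (a simplified rank-based 'hit' rule).

    `searches` is a list of (true_identity, returned_candidate_ids) pairs,
    where every probe genuinely has a match in the gallery.
    """
    hits = sum(1 for true_id, candidates in searches if true_id in candidates)
    return hits / len(searches)

# Hypothetical example: 3 of 4 searches contain the true match somewhere in a
# multi-candidate list, so the metric reports 75% -- even though in every "hit"
# a reviewer still has to pick the right face out of a mostly non-matching lineup.
example = [
    ("alice", ["bob", "alice", "carol", "dan"]),
    ("erin",  ["frank", "grace", "erin", "heidi"]),
    ("ivan",  ["judy", "mallory", "oscar", "peggy"]),   # true match missing
    ("trent", ["trent", "victor", "walter", "sybil"]),
]
print(true_positive_rate(example))  # 0.75
```

The resulting score says nothing about whether a human reviewer will pick the correct candidate out of that lineup, which is exactly the gap described above.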
Let’s look at another metric, the false positive rate, which assesses how often an FRT search will return results when there is no matching image in the database. Breaking results down by race, the same algorithm that produces a 98 percent true positive rate overall can also produce a false positive rate for Black men several times the false positive rate for white men, and an even higher false positive rate for Black women. This example is not merely hypothetical: in [NIST testing](https://nvlpubs.nist.gov/nistpubs/ir/2019/nist.ir.8280.pdf), many algorithms have exhibited this pattern. (1) [Other recent NIST testing](https://pages.nist.gov/frvt/html/frvt_demographics.html) also shows algorithms producing false positive rates tens or hundreds of times higher for females older than 65 born in West African countries than for males ages 20-35 born in Eastern European countries. (2)
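To see how a single headline number can hide this, here is a small worked example with hypothetical rates; the numbers are chosen only to illustrate the arithmetic and are not drawn from any specific NIST report.

```python
# Hypothetical per-search false positive rates for two demographic groups,
# chosen only to illustrate the arithmetic; they are not measured values.
searches_per_group = 10_000
fpr_group_a = 0.001   # 0.1% of non-mated searches return a false "hit"
fpr_group_b = 0.010   # 1.0% -- ten times higher for group B

false_hits_a = round(searches_per_group * fpr_group_a)   # 10 people wrongly flagged
false_hits_b = round(searches_per_group * fpr_group_b)   # 100 people wrongly flagged

# Both outcomes are compatible with the same overall true positive rate,
# because that metric only looks at searches where a real match exists.
print(false_hits_a, false_hits_b)
```

Each false hit is a person who can land in a digital lineup, and potentially under investigation, despite never being in the database, and that burden falls unevenly across groups.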
By only considering the true positive rate, we miss these extreme disparities, which can lead to devastating consequences. Across the United States, police are arresting people based on false matches and harming people like [Nijeer Parks](https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html), a Black man whom police in New Jersey falsely arrested and held in jail for ten days because they trusted the results of face recognition and overlooked obvious exonerating evidence. Human over-reliance on face recognition is already a problem; focusing on performance scores might make things worse.

What’s the takeaway for policymakers?

Lawmakers should know that a facial recognition algorithm’s performance on a test cannot be easily or quickly generalized into broad claims about whether the algorithm is safe. Performance scores are not an easy fix for the harms resulting from the use of face recognition systems, and they of course don’t account for the humans who will inevitably be in the loop.

As the ACLU explained in its recent [public comment](https://www.aclu.org/documents/aclu-comment-facial-recognition-and-biometric-technologies-eo-14074-13e) to the Biden administration, the problems with facial recognition run deeper than the software itself. Facial recognition is dangerous when it is inaccurate, a problem that testing aims to address, but it would be dangerous even if it were hypothetically perfectly accurate. In such a world, governments could use face surveillance to precisely track us as we leave home, attend a protest, or take public transit to the doctor’s office. This is why policymakers in a growing list of U.S. cities and counties have decided to prohibit government use of face recognition, and why the [ACLU supports a federal moratorium](https://www.aclu.org/press-releases/aclu-calls-moratorium-law-and-immigration-enforcement-use-facial-recognition) on its use by law and immigration enforcement agencies.

Related: [ACLU Comment re: Request for Comment on Law Enforcement Agencies’ Use of Facial Recognition Technology, Other Technologies Using Biometric Information, and Predictive Algorithms (Exec. Order 14074, Section 13(e))](https://www.aclu.org/documents/aclu-comment-facial-recognition-and-biometric-technologies-eo-14074-13e) (Source: American Civil Liberties Union)

Conversations about the shortcomings of performance scores are important, but instead of trying to find some magic number, policymakers should focus on how any use of facial recognition can expand discriminatory policing, massively expand the power of government, and create the conditions for authoritarian control of our private lives.

Endnotes:

(1) For one demonstrative example, an FRT algorithm developed by the vendor NEC and submitted to NIST’s vendor testing program produced an overall true positive rate above 98% in some of the testing. See National Institute of Standards and Technology, Face Recognition Vendor Test Report Card for NEC-2 at 1, https://pages.nist.gov/frvt/reportcards/1N/nec_2.pdf (finding a false negative identification rate (FNIR) of less than .02, or 2%, for testing using multiple datasets; the true positive identification rate (TPIR) is one minus the FNIR). However, in other NIST testing, the same algorithm also produced false positive rates for Black men more than three times the false match rate for white men at various thresholds. See Patrick Grother et al., U.S. Dep’t of Com., Nat’l Inst. of Standards & Tech., Face Recognition Vendor Test Part 3: Demographic Effects, Annex 16 at 34 fig. 32 (Dec. 2019), https://pages.nist.gov/frvt/reports/demographics/annexes/annex_16.pdf.
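In symbols, the relationship relied on in endnote (1), with FNIR denoting the false negative identification rate, is simply:

```latex
\mathrm{TPIR} = 1 - \mathrm{FNIR}, \qquad \text{so } \mathrm{FNIR} < 0.02 \implies \mathrm{TPIR} > 0.98 .
```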
(2) See National Institute of Standards and Technology, Face Recognition Technology Evaluation: Demographic Effects in Face Recognition, FRTE 1:1 Demographic Differentials Summary, False Positive Differentials, https://pages.nist.gov/frvt/html/frvt_demographics.html (last visited February 6, 2024). The table summarizes demographic differentials in false match rates for various 1:1 algorithms and highlights that many algorithms exhibit false match rate differentials across images of people of different ages, sexes, and regions of birth. For example, the algorithm labelled “recognito_001” produced a false match rate for images of females over 65 born in West African countries 3,000 times the false match rate for images of males ages 20-35 born in Eastern European countries. NIST notes that “While this table lists results for 1:1 algorithms, it will have relevance to that subset of 1:N algorithms that implement 1:N search as N 1:1 comparisons followed by a sort operation. The demographic effects noted here will be material in 1:N operations and will be magnified if the gallery and the search stream include the affected demographic.”