
States Dust Off Obscure Anti-Mask Laws to Target Pro-Palestine Protesters

Arcane laws banning people from wearing masks in public are now being used to target people who wear face coverings while peacefully protesting Israel's war in Gaza. That's a big problem.

In the 1940s and '50s, many U.S. states passed anti-mask laws in response to the Ku Klux Klan, whose members often hid their identities as they terrorized their victims. These laws were not enacted to protect those victims, but because political leaders wanted to defend segregation as part of a "modern South" and felt that the Klan's violent racism made them look bad.

Now these laws are being used across the country to try to clamp down on disfavored groups and movements, raising questions about selective prosecution. Just this month, Ohio Attorney General Dave Yost [sent a letter](https://www.latimes.com/world-nation/story/2024-05-08/masked-student-protesters-could-face-felony-charges-under-anti-kkk-law-ohio-attorney-general-warns) to the state's 14 public universities alerting them that protesters could be charged with a felony under the state's little-used anti-mask law, which carries penalties of six to 18 months in prison. An Ohio legal expert, Rob Barnhart, observed that he had [never heard](https://www.wosu.org/politics-government/2024-05-07/protesters-could-face-felony-charge-if-arrested-while-wearing-a-mask-under-obscure-ohio-law) of the law being applied before, even to bank robbers wearing masks. While Yost framed his letter as "proactive guidance," Barnhart countered: "I find it really hard to believe that this is some public service announcement to students to be aware of a 70-year-old law that nobody uses."

Related: [America's Mask Bans in the Age of Face Recognition Surveillance](https://www.aclu.org/news/free-speech/americas-mask-bans-in-the-age-of-face-recognition-surveillance) (Source: American Civil Liberties Union)

Ohio officials aren't the only ones who appear to be selectively enforcing anti-mask laws against student protesters. Administrators at the University of North Carolina [have warned](https://chapelboro.com/news/unc/unc-asks-pro-palestine-protesters-to-stop-wearing-masks-citing-1953-anti-kkk-law) protesters that wearing masks violates the state's anti-mask law and "runs counter to our campus norms and is a violation of UNC policy." Students arrested during a protest at the University of Florida were [charged with](https://www.sun-sentinel.com/2024/04/29/police-make-first-arrests-in-florida-of-pro-palestinian-protesters-at-two-university-campuses/), among other things, wearing masks in public. At the University of Texas at Austin, Gov. Greg Abbott and university officials called in state troopers to [violently](https://www.texastribune.org/2024/04/29/university-texas-pro-palestinian-protest-arrest/) break up pro-Palestinian protests after the school [rescinded permission](https://www.houstonchronicle.com/politics/texas/article/ut-austin-police-protest-arrests-19422645.php) for a rally on the grounds that protesters had a "declared intent to violate our policies and rules." One of the rules administrators cited was a university ban on wearing face masks "to obstruct law enforcement."

At a time when both public and private actors are increasingly turning to invasive surveillance technologies to identify protesters, mask-wearing is an important way to safeguard our right to speak out on issues of public concern. The ACLU has raised concerns for decades about how anti-mask laws are wielded, and we are especially worried about the risk they pose to our constitutional freedoms in the digital age.

In particular, the emergence of face recognition technology has changed what it means to appear in public. Increasingly ubiquitous cameras and corrosive technology products such as [Clearview AI](https://www.nytimes.com/interactive/2021/03/18/magazine/facial-recognition-clearview-ai.html) allow police to easily identify people. So can private parties. The push to normalize face recognition by security agencies threatens to turn our faces into the functional equivalent of license plates, and anti-mask laws are in effect a requirement to display those "plates" any time we are in public. Humans are not cars.

Of course, mask-wearing is not just about privacy. It can also be an expressive act, a religious practice, a political statement, or a public-health measure. The ACLU has chronicled the [mask-wearing debate](https://www.aclu.org/news/free-speech/americas-mask-bans-in-the-age-of-face-recognition-surveillance) for years. As recently as 2019, anti-mask laws were used against [Occupy Wall Street](https://www.theatlantic.com/national/archive/2011/09/nypd-arresting-wall-street-protesters-wearing-masks/337706/) protesters, [anti-racism](https://www.ajc.com/news/state--regional/white-nationalist-richard-spencer-riles-auburn-campus-three-arrested/5HeaD0TCfvfNI7DuXDUciJ/) [protesters](https://wtvr.com/2017/09/19/mask-in-public-court-hearing/), and [police violence](https://wbhm.org/feature/2019/experts-alabamas-mask-law-is-outdated/) protesters. The coronavirus temporarily scrambled the mask-wearing debate, making a mask both a protective and a [political](https://apnews.com/article/virus-outbreak-donald-trump-ap-top-news-politics-health-7dce310db6e85b31d735e81d0af6769c) act.

One question that remains is whether and how the authorities distinguish between those who wear a mask to protect their identities and those who wear one to protect themselves against disease. That ambiguity opens up even more space for discretionary and selective enforcement. In North Carolina, the state Senate is currently considering an anti-protest bill that would remove the exception for wearing a mask for health purposes altogether, and would add a sentencing enhancement for committing a crime while wearing a mask.

For those speaking out in support of the Palestinian people, being recognized in a crowd can have severe consequences for their personal and professional security. During the Gaza protests, pro-Israel activists and organizations have posted the faces and personal information of pro-Palestine activists to intimidate them, get them fired, or otherwise shame them for their views. These doxing attempts have intensified, with viral videos showing counterprotesters demanding that pro-Palestinian protesters remove their masks at rallies. Professionally, employers have [terminated workers](https://www.thecut.com/2023/10/israel-hamas-war-job-loss-social-media.html) for their comments about Israel and Palestine, and CEOs have [demanded](https://finance.yahoo.com/news/bill-ackman-wants-harvard-name-104621975.html) that universities hand over the names of protesters so they can be blacklisted from jobs.

While wearing a mask can make it harder to identify a person, protesters should know that it is not always effective. Masks haven't stopped the [Chinese government](https://www.nytimes.com/2022/12/02/business/china-protests-surveillance.html) or [Google](https://www.cbsnews.com/sanfrancisco/news/google-workers-fired-after-protesting-israeli-contract-file-complaint-labor-regulators/), for example, from identifying protesters and taking action against them. Technologies that can be used to identify masked protesters range from [Bluetooth and WiFi signals](https://www.notus.org/technology/war-zone-surveillance-border-us), to historical cell phone location data, to constitutionally dubious devices called [IMSI catchers](https://www.aclu.org/news/privacy-technology/police-citing-terrorism-buy-stingrays-used-only), which pretend to be a cell tower and prompt nearby phones to reply with an identifying ping of their own. We may also see the development of [video analytics](https://www.aclu.org/publications/dawn-robot-surveillance) technologies that use gait recognition or body-proportion measurements. During COVID, face recognition also got [much](https://www.bbc.com/news/technology-56517033) [better](https://www.zdnet.com/article/facial-recognition-now-algorithms-can-see-through-face-masks/) at identifying people wearing partial face masks.

Protecting people's freedom to wear masks has costs. It can make it harder to identify people who commit crimes, whether they are bank robbers, muggers, or members of the "[violent mob](https://www.latimes.com/california/story/2024-05-07/a-ucla-timeline-from-peaceful-encampment-to-violent-attacks-aftermath)" that attacked a peaceful protest encampment at UCLA. Like all freedoms, the freedom to wear a mask can be abused. But that does not justify taking it away from those protesting peacefully, especially in today's surveillance environment.

Anti-mask laws undoubtedly have a significant chilling effect on some protesters' willingness to show up for causes they believe in. The bravery of those who do show up to support a highly controversial cause in the current surveillance landscape is admirable, but Americans shouldn't have to be brave to exercise their right to protest. Until privacy protections catch up with technology, officials and policymakers should do all they can to make it possible for less-brave people to show up and protest. That includes refusing to use anti-mask laws to target peaceful protesters.

Police Say a Simple Warning Will Prevent Face Recognition Wrongful Arrests. That's Just Not True.

Face recognition technology in the hands of police is dangerous. Police departments across the country frequently use the technology to try to identify images of unknown suspects by comparing them to large photo databases, but it often fails to generate a correct match. Numerous [studies](https://www.washingtonpost.com/technology/2019/12/19/federal-study-confirms-racial-bias-many-facial-recognition-systems-casts-doubt-their-expanding-use/) have shown that face recognition technology misidentifies Black people and other people of color at higher rates than white people. To date, there have been at least seven wrongful arrests we know of in the United States due to police reliance on incorrect face recognition results — and those are just the known cases. In nearly every one of those instances, [the](https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html) [person](https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html) [wrongfully](https://www.newyorker.com/magazine/2023/11/20/does-a-i-lead-police-to-ignore-contradictory-evidence/) [arrested](https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html) [was](https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html) [Black](https://www.freep.com/story/news/local/michigan/detroit/2020/07/10/facial-recognition-detroit-michael-oliver-robert-williams/5392166002/).

Supporters of police use of face recognition often portray these failures as unfortunate mistakes that are unlikely to recur. Yet they keep coming. Last year, six Detroit police officers showed up at the doorstep of an [eight-months-pregnant woman](https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html) and wrongfully arrested her in front of her children for a carjacking that she could not plausibly have committed. A month later, the prosecutor dismissed the case against her.

Related: [I Did Nothing Wrong. I Was Arrested Anyway.](https://www.aclu.org/news/privacy-technology/i-did-nothing-wrong-i-was-arrested-anyway) (Source: American Civil Liberties Union)

Police departments should be doing everything in their power to avoid wrongful arrests, which can turn people's lives upside down and result in loss of work, inability to care for children, and other harmful consequences. So what's behind these repeated failures? As the ACLU explained in a [recent submission](https://www.aclu.org/documents/aclu-comment-facial-recognition-and-biometric-technologies-eo-14074-13e) to the federal government, there are multiple ways in which police use of face recognition technology goes wrong. Perhaps most glaring: the most widely adopted police policy designed to avoid false arrests in this context *simply does not work*. Records from the wrongful arrest cases demonstrate why.

It has become standard practice among police departments and the companies making this technology to warn officers that a result from a face recognition search does not constitute a positive identification of a suspect, and that additional investigation is necessary to develop the probable cause needed to obtain an arrest warrant. For example, the International Association of Chiefs of Police [cautions](https://www.theiacp.org/sites/default/files/2019-10/IJIS_IACP%20WP_LEITTF_Facial%20Recognition%20UseCasesRpt_20190322.pdf) that a face recognition search result is "a strong clue, and nothing more, which must then be corroborated against other facts and investigative findings before a person can be determined to be the subject whose identity is being sought." The Detroit Police Department's face recognition technology [policy](https://detroitmi.gov/sites/detroitmi.localhost/files/2020-10/307.5%20Facial%20Recognition.pdf), adopted in September 2019, similarly states that a face recognition search result is only "an investigative lead and IS NOT TO BE CONSIDERED A POSITIVE IDENTIFICATION OF ANY SUBJECT. Any possible connection or involvement of any subject to the investigation must be determined through further investigation and investigative resources."

Related: [ACLU Comment re: Request for Comment on Law Enforcement Agencies' Use of Facial Recognition Technology, Other Technologies Using Biometric Information, and Predictive Algorithms (Exec. Order 14074, Section 13(e))](https://www.aclu.org/documents/aclu-comment-facial-recognition-and-biometric-technologies-eo-14074-13e) (Source: American Civil Liberties Union)

Police departments across the country, from [Los Angeles County](https://lacris.org/LACRIS%20Facial%20Recognition%20Policy%20v_2019.pdf) to the [Indiana State Police](https://www.in.gov/iifc/files/Indiana_Intelligence_Fusion_Center_Face_Recognition_Policy.pdf) to the U.S. [Department of Homeland Security](https://www.dhs.gov/sites/default/files/2023-09/23_0913_mgmt_026-11-use-face-recognition-face-capture-technologies.pdf), provide similar warnings. However ubiquitous, these warnings have failed to prevent harm.

We've seen police treat a face recognition result as a positive identification, ignoring or not understanding warnings that the technology is simply not reliable enough to provide one.

In Louisiana, for example, police relied solely on an incorrect face recognition search result from Clearview AI as purported probable cause for an arrest warrant. The officers did this even though the law enforcement agency had signed a contract with the face recognition company acknowledging that officers "must conduct further research in order to verify identities or other data generated by the [Clearview] system." That overreliance led to [Randal Quran Reid](https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html), a Georgia resident who had never even been to Louisiana, being wrongfully arrested for a crime he couldn't have committed and held in jail for nearly a week.

In an [Indiana investigation](https://www.courierpress.com/story/news/local/2023/10/19/evansville-police-using-clearview-ai-facial-recognition-to-make-arrests/70963350007/), police similarly obtained an arrest warrant based only on an assertion that the detective "viewed the footage and utilized the Clearview AI software to positively identify the female suspect." No additional confirmatory investigation was conducted.

But even when police do conduct additional investigative steps, those steps often *exacerbate and compound* the unreliability of face recognition searches. This is a particular problem when police move directly from a face recognition result to a witness identification procedure, such as a photographic lineup.

Face recognition technology is designed to generate a list of faces that are *similar* to the suspect's image, but that often will not actually be a match. When police think they have a match, they frequently ask a witness who saw the suspect to view a photo lineup consisting of the image derived from the face recognition search plus five "filler" photos of other people. Photo lineups have long been known to carry a high risk of misidentification, and the addition of face recognition-generated images only makes it worse. Because the face recognition-generated image is likely to appear more similar to the suspect than the filler photos, there is a [heightened chance](https://www.newyorker.com/magazine/2023/11/20/does-a-i-lead-police-to-ignore-contradictory-evidence/) that a witness will [mistakenly choose](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4101826) that image out of the lineup, even though it is not a true match.

This problem has contributed to known wrongful arrests, including the arrests of [Porcha Woodruff](https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html), [Michael Oliver](https://www.freep.com/story/news/local/michigan/detroit/2020/07/10/facial-recognition-detroit-michael-oliver-robert-williams/5392166002/), and [Robert Williams](https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html) by Detroit police (the ACLU represents Mr. Williams in a [wrongful arrest lawsuit](https://www.aclu.org/news/privacy-technology/i-did-nothing-wrong-i-was-arrested-anyway)). In these cases, police obtained an arrest warrant based solely on the combination of a false match from face recognition technology and a false identification from a witness viewing a photo lineup constructed around the face recognition lead and five filler photos. Each of the witnesses chose the face recognition-derived false match instead of concluding that the suspect did not, in fact, appear in the lineup.

A lawsuit filed earlier this year in Texas alleges that a similar series of failures led to the wrongful arrest of [Harvey Eugene Murphy Jr.](https://www.theguardian.com/technology/2024/jan/22/sunglass-hut-facial-recognition-wrongful-arrest-lawsuit?ref=upstract.com) by Houston police. And in New Jersey, police wrongfully arrested [Nijeer Parks](https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html) in 2019 after face recognition technology incorrectly flagged him as a likely match to a shoplifting suspect. An officer who had seen the suspect (before he fled) viewed the face recognition result and said he thought it matched his memory of the suspect's face.

After the Detroit Police Department's third known wrongful arrest from face recognition technology became public last year, Detroit's chief of police [acknowledged](https://www.facebook.com/CityofDetroit/videos/287218473992047) the problem of erroneous face recognition results tainting subsequent witness identifications. He explained that by moving straight from face recognition result to lineup, "it is possible to taint the photo lineup by presenting a person who looks most like the suspect" but is not in fact the suspect. The department's policy of merely telling police to conduct "further investigation" had not stopped them from engaging in this bad practice.

Because police have repeatedly proved unable or unwilling to follow face recognition searches with adequate independent investigation, police access to the technology must be strictly curtailed — and the best way to do this is through strong [bans](https://www.aclu.org/sites/default/files/field_document/02.16.2021_coalition_letter_requesting_federal_moratorium_on_facial_recognition.pdf). More than 20 jurisdictions across the country, from Boston to Pittsburgh to San Francisco, have done just that, barring police from using this dangerous technology.

Boilerplate warnings have proven ineffective. Whether they fail because of human [cognitive bias](https://www.nytimes.com/2020/06/09/technology/facial-recognition-software.html) toward trusting computer outputs, poor police training, incentives to close cases quickly, implicit racism, lack of consequences, the fallibility of witness identifications, or other factors, we don't know. But if the experience of known wrongful arrests teaches us anything, it is that such warnings are woefully inadequate to protect against abuse.

How to Protect Consumer Privacy and Free Speech

Technology is a necessity of modern life. People of all ages rely on it for everything from accessing information and connecting with others to paying for goods, using transportation, getting work done, and speaking out about the issues of the day. Without adequate privacy protections, technology can be co-opted to surveil us online and intrude on our private lives, not only by the government but also by businesses, with grave consequences for our rights.

There is sometimes a misconception that shielding our personal information from this kind of misuse will violate the First Amendment rights of corporations that stand to profit from collecting, analyzing, and sharing that information. But we don't have to sacrifice robust privacy protection to uphold anyone's right to free speech. In fact, when done right, strong privacy protections reinforce speech rights. They create spaces where people have the confidence to exercise their First Amendment rights to candidly communicate with friends, seek out advice and community, indulge curiosity, and anonymously speak or access information.

At the same time, simply calling something a "privacy law" doesn't make it so. Take the California Age-Appropriate Design Code Act (CAADCA), a law currently under review by the Ninth Circuit in *NetChoice v. Bonta*. As the ACLU and the ACLU of Northern California argued in a [friend-of-the-court brief](https://www.aclu.org/cases/netchoice-llc-v-bonta?document=Amici-Curiae-Brief-of-the-ACLU-%26-ACLU-of-Northern-California), this law improperly includes content restrictions on online speech and is unconstitutional. Laws can and should be crafted to protect both privacy and free speech rights. It is critical that legislatures and courts get the balance right when it comes to laws that implicate our ability to control our personal information and to speak and access content online.

Consumer privacy matters. With disturbing frequency, businesses use technology to siphon troves of personal information from us, learning things about our health, our family situations, our financial status, our location, our age, and even our beliefs. Not only can they paint intimate portraits of our lives, but, armed with this information, they can raise or lower prices depending on our demographics, make discriminatory predictions about [health outcomes](https://www.wired.com/story/argentina-algorithms-pregnancy-prediction/), improperly deny [housing](https://www.hud.gov/sites/dfiles/Main/documents/HUD_v_Facebook.pdf) or [jobs](https://www.cnn.com/2023/06/12/tech/facebook-job-ads-gender-discrimination-asequals-intl-cmd/index.html), [hike insurance rates](https://www.propublica.org/article/health-insurers-are-vacuuming-up-details-about-you-and-it-could-raise-your-rates), and flood people of color and low-income people with [ads for predatory loans](https://www.nytimes.com/2011/09/08/opinion/fair-lending-and-accountability.html).

All this nefarious behavior has serious consequences for our financial stability, our health, our quality of life, and our civil rights, including our First Amendment rights. Better consumer privacy gives advocates, activists, whistleblowers, dissidents, authors, artists, and others the confidence to speak out. Only when people are free from the fear that what they do online is being monitored and shared can they fully enjoy their rights to read, investigate, discuss, and be inspired by whatever they want.

Yet in recent years, tech companies have argued that consumer privacy protections limit their First Amendment rights to collect, use, and share people's personal information. These arguments are often faulty. Just because someone buys a product or signs up for a service doesn't give the company providing that good or service a First Amendment right to share or use the personal information it collects from that person however it wants.

To the contrary, laws that require data minimization and high privacy settings by default are good policy and can easily pass First Amendment muster. Arguments to the contrary not only misunderstand the First Amendment; they would actually weaken its protections.

Laws that suppress protected speech in order to stop children from accessing certain types of content generally hurt speech and privacy rights for all. That's why First Amendment challenges to laws [that limit what we can see online](https://www.aclu.org/news/free-speech/arkansas-wants-to-unconstitutionally-card-people-before-they-use-social-media) typically succeed. The Supreme Court has made clear time and again that the government cannot regulate speech solely to stop children from seeing ideas or images that a legislative body believes to be unsuitable. Nor can it limit adults' access to speech in the name of shielding children from certain content.

Related: [Arkansas Wants to Unconstitutionally "Card" People Before They Use Social Media](https://www.aclu.org/news/free-speech/arkansas-wants-to-unconstitutionally-card-people-before-they-use-social-media) (Source: American Civil Liberties Union)

The CAADCA is unconstitutional for these reasons, despite the legislature's understandable concerns about the privacy, wellbeing, and safety of children. The law was drafted so broadly that it would actually have hurt children. It could have prevented young people and adults from accessing online mental health resources; support communities related to school shootings and suicide prevention; and reporting about war, the climate crisis, and gun violence. It could also have interfered with students' attempts to express political or religious speech, or to send and receive personal messages about a death in the family, rejection from a college, or a breakup. Paradoxically, the law would have exposed everyone's information to greater privacy risks by encouraging companies to gather and analyze user data for age-estimation purposes.

While we believe the CAADCA burdens free speech and should be struck down, it is important that the court not issue a ruling that forecloses paths other privacy laws could take to protect privacy without violating the First Amendment. We need privacy, and we need free speech too, especially in the digital age.

Communities Should Reject Surveillance Products Whose Makers Won't Allow Them to be Independently Evaluated

American communities are being confronted with a great deal of new police technology these days, much of which involves surveillance or otherwise raises the question: "Are we as a community comfortable with our police deploying this new technology?" A critical related question is: "Does it even work, and if so, how well?" It's hard for communities, their political leaders, and their police departments to know what to buy if they don't know what works and to what degree.

One thing I've learned from following new law enforcement technology for over 20 years is that there is an awful lot of snake oil out there. When a new capability arrives on the scene — whether it's [face recognition](https://www.aclu.org/wp-content/uploads/publications/drawing_blank.pdf), [emotion recognition](https://www.aclu.org/blog/privacy-technology/surveillance-technologies/experts-say-emotion-recognition-lacks-scientific/), [video analytics](https://www.aclu.org/wp-content/uploads/publications/061819-robot_surveillance.pdf), or "[big data](https://www.aclu.org/news/privacy-technology/chicago-police-heat-list-renews-old-fears-about)" pattern analysis — some companies will rush to promote the technology long before it is good enough for deployment, which sometimes [never happens](https://www.aclu.org/blog/privacy-technology/surveillance-technologies/experts-say-emotion-recognition-lacks-scientific/). That may be even more true today in the age of artificial intelligence: "AI" is a term that often amounts to no more than trendy marketing jargon.

Related: [Six Questions to Ask Before Accepting a Surveillance Technology](https://www.aclu.org/news/privacy-technology/six-questions-to-ask-before-accepting-a-surveillance-technology) (Source: American Civil Liberties Union)

Given all this, communities and city councils should not adopt new technology that has not been subjected to testing and evaluation by an independent, disinterested party. That's true for all types of technology, but doubly so for technologies that have the potential to change the balance of power between the government and the governed, like surveillance equipment. After all, there's no reason to get [wrapped up in big debates](https://www.aclu.org/news/privacy-technology/six-questions-to-ask-before-accepting-a-surveillance-technology) about privacy, security, and government power if the tech doesn't even work.

One example of a company refusing to allow independent review of its product is the license plate recognition company Flock, which is pushing its surveillance devices into many American communities and tying them into a centralized national network. (We wrote more about this company in a 2022 [white paper](https://www.aclu.org/publications/fast-growing-company-flock-building-new-ai-driven-mass-surveillance-system).) Flock has steadfastly refused to allow the [independent](https://www.aclu.org/news/privacy-technology/are-gun-detectors-the-answer-to-mass-shootings) security technology reporting and testing outlet [IPVM](https://ipvm.com/) to obtain one of its license plate readers for testing, though IPVM has tested all of Flock's major competitors. That hasn't stopped Flock from [boasting](https://ipvm.com/reports/flock-lpr-city-sued?code=lfgsdfasd543453) that "Flock Safety technology is best-in-class, consistently performing above other vendors." Claims like these ring hollow when the company doesn't appear to have enough confidence in its product to let IPVM test it.

Related: [Experts Say 'Emotion Recognition' Lacks Scientific Foundation](https://www.aclu.org/news/privacy-technology/experts-say-emotion-recognition-lacks-scientific) (Source: American Civil Liberties Union)

Communities considering installing Flock cameras should take note, especially because errors by Flock's and other companies' license plate readers can leave innocent drivers with their [hands behind their heads](https://ipvm.com/reports/flock-lpr-city-sued?code=lfgsdfasd543453), facing jittery police pointing guns at them. Such errors can also expose police departments and cities to lawsuits.

Even worse is when a company pretends that its product has been subject to independent review when it hasn't. The metal detector company Evolv, which sells — wait for it — *AI* metal detectors, submitted its technology for testing by a supposedly independent lab operated by the University of Southern Mississippi, and publicly touted the results. But [IPVM](https://ipvm.com/reports/bbc-evolv) and the [BBC](https://www.bbc.com/news/technology-63476769) reported that the lab, the National Center for Spectator Sports Safety and Security ([NCS4](https://ncs4.usm.edu/)), had colluded with Evolv to manipulate the report and hide negative findings about the effectiveness of the company's product. Like Flock, Evolv refuses to allow IPVM to obtain one of its units for testing. (We wrote about Evolv and its product [here](https://www.aclu.org/news/privacy-technology/are-gun-detectors-the-answer-to-mass-shootings).)

One reason these companies can prevent a tough, independent reviewer such as IPVM from obtaining their equipment is their subscription- and cloud-based architecture. "Most companies in the industry still operate on the more traditional model of having open systems," IPVM government research director Conor Healy told me. "But there's a rise in demand for cloud-based surveillance, where people can store things in the cloud, access them on their phone, see the cameras. Cloud-based surveillance by definition involves central control by the company that's providing the cloud services." Cloud-based architectures can [worsen the privacy risks](https://www.aclu.org/news/civil-liberties/major-hack-of-camera-company-offers-four-key-lessons-on-surveillance) created by a surveillance system. Centralized control also increases a company's ability to decide who may carry out an independent review.

We're living in an era in which a lot of new technology is emerging, with many companies racing to be first to market. As Healy told me, "We see a lot of claims of AI, all the time. At this point, almost every product I see out there that gets launched has some component of AI." But like other technologies before them, these products often arrive in immature, inaccurate, or outright deceptive forms, relying on little more than "AI" as a buzzword.

It's vital for independent reviewers to contribute to our ongoing local and national conversations about new surveillance and other police technologies. It's unclear why a company that has faith in its product would block independent review, which is all the more reason for buyers to take note of the companies that do.

Dozens of Police Agencies in California Are Still Sharing Driver Locations with Anti-Abortion States. We're Fighting Back.

Over the last decade, California has built up some of the nation's strongest driver privacy protections, thanks to the hard work of activists, civil rights groups, and elected leaders.

One law in particular, often called [SB 34](https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200SB34), prohibits police from sharing detailed maps of people's driving patterns with the federal government and agencies in other states — a protection that has only grown more important since the end of *Roe v. Wade* and the subsequent surge in abortion criminalization.

But dozens of California police departments have decided to defy the law, even after receiving [clear guidance](https://oag.ca.gov/system/files/media/2023-dle-06.pdf) from California Attorney General Rob Bonta, the state's chief law enforcement officer. Last month the ACLU of Northern California and our partners [sent Attorney General Bonta a letter](https://www.aclunc.org/sites/default/files/2024-01-31_letter_to_ag_bonta_re_sb_34_final.pdf) listing 35 police agencies that have refused to comply with the law and protect driver privacy.

We should all be able to drive to a doctor's office, place of worship, or political rally without being tracked and cataloged by police agencies. But for years now, police have used automated license plate readers (ALPRs) to record and track the movements of drivers on a previously unseen scale. These [systems](https://www.aclu.org/documents/you-are-being-tracked-how-license-plate-readers-are-being-used-record-americans-movements) allow police to collect and store information about drivers whose cars pass through ALPR cameras' fields of view, which, along with the date and time of capture, can reveal sensitive details about our movements and, as a result, our private lives.

Related: [You Are Being Tracked: How License Plate Readers Are Being Used to Record Americans' Movements](https://www.aclu.org/documents/you-are-being-tracked-how-license-plate-readers-are-being-used-record-americans-movements) (Source: American Civil Liberties Union)

The ACLU has long seen the danger ALPR surveillance poses and, working alongside communities on the ground, has fought to bolster California's legal protections for driver privacy. For over a decade, we have conducted investigations, advocacy, and litigation focused on how police agencies use ALPRs to track law-abiding drivers, amass troves of sensitive information, and use it to harm people.

In the wake of the [ACLU's groundbreaking report](https://www.aclu.org/files/assets/071613-aclu-alprreport-opt-v05.pdf) on ALPRs across the U.S., [we called out](https://www.aclunc.org/blog/use-automated-license-plate-readers-expanding-northern-california-and-data-shared-feds) police use of ALPRs in 2013 as a threat to driver privacy and warned that California lacked statewide driver privacy protections. In 2016, thanks in part to the advocacy of the ACLU and [allies](https://www.eff.org/deeplinks/2015/10/success-sacramento-four-new-laws-one-veto-all-victories-privacy-and-transparency), the California legislature passed [SB 34](https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200SB34), the law at issue today. In [2019](https://www.aclu.org/news/immigrants-rights/documents-reveal-ice-using-driver-location-data), we discovered Immigration and Customs Enforcement's (ICE) exploitation of ALPR-collected information to track and target immigrants in California and across the United States.

From there, we took action to enforce California's driver privacy protections. In [2021](https://www.aclunc.org/news/california-activists-sue-marin-county-sheriff-illegally-sharing-drivers-license-plate-data-ice), we sued Marin County, California, for illegally sharing millions of local drivers' license plates and locations with federal and out-of-state agencies, including ICE. The sheriff eventually agreed to comply with SB 34 as part of a [settlement agreement](https://www.aclunc.org/our-work/legal-docket/lagleva-v-doyle-license-plate-surveillance#:~:text=In%20May%202022%2C%20the%20plaintiffs,54.), but we believed that many other California police agencies were still violating SB 34.

We rang the alarm again in the wake of the *Dobbs* decision overturning *Roe v. Wade*. Alongside our partners at the Electronic Frontier Foundation and the ACLU of Southern California, we [sent letters to over 70 law enforcement agencies in California](https://www.aclunc.org/news/civil-liberties-groups-demand-california-police-stop-sharing-drivers-location-data-police-anti) demanding that they stop sharing people's driving patterns with states that have criminalized abortion care. We also notified the attorney general's office of these violations.

Following our letters, the attorney general issued [instructions](https://oag.ca.gov/system/files/media/2023-dle-06.pdf) to police across the state to follow SB 34's plain text and cease sharing license plate information with state and federal agencies outside California. While some agencies have come into compliance, many police departments are digging in and refusing to follow the law. Police lobbyists have even [asked](https://www.eff.org/files/2024/01/23/bulletin_reponse_letter.03_jrt_final.khb_.02.pdf) the attorney general to withdraw his interpretation of the law.

Simply put, the position touted by police agencies and their lobbyists puts Californians at risk. SB 34 matters because when police track and share the locations of law-abiding drivers, that information can easily be used to facilitate racist policing, [punitive fees](https://www.buzzfeednews.com/article/alexcampbell/the-ticket-machine), and the [discriminatory targeting](https://www.ap.org/ap-in-the-news/2012/with-cameras-informants-nypd-eyed-mosques) of people in California and beyond. And, as [our letters warned](https://www.eff.org/files/2023/05/24/tracy.pdf), when California shares ALPR information with authorities in states with anti-abortion or anti-trans laws, police and prosecutors gain new power to track and prosecute people who traveled to California to receive reproductive or gender-affirming care.

We should all be able to travel safely on the state's roads without our movements being handed to authorities outside the state. That is why we have continued to push California police agencies to follow California's driver privacy law, and why we have supported localities [that reject](https://www.aclunc.org/blog/alameda-rejects-surveillance-deal-company-tied-ice) ALPR programs at odds with their values.

It is unacceptable that police agencies charged with enforcing laws are refusing to comply with this one. While we are pleased with Attorney General Bonta's strong statement on SB 34, we urge him to use all available means at his disposal to ensure compliance. And rest assured that the ACLU will continue fighting to enact and enforce protections that keep all of us safe, no matter where we go in the state.

*This article was [originally featured](https://www.aclunc.org/blog/californians-fought-hard-driver-privacy-protections-why-are-police-refusing-follow-them) on the blog of the ACLU of Northern California.*

When it Comes to Facial Recognition, There is No Such Thing as a Magic Number

We often hear about government misuse of face recognition technology (FRT) and how it can [derail](https://www.wired.com/story/wrongful-arrests-ai-derailed-3-mens-lives/) a person's life through wrongful arrests and other harms. Despite mounting evidence, government agencies continue to push face recognition systems on communities across the United States. Key to this effort are the corporate makers and sellers who market the technology as reliable, accurate, and safe, often by pointing to their products' scores on government-run performance tests.

All of this might tempt policymakers to believe that the safety and civil rights problems of facial recognition can be solved by mandating a certain performance score or grade. But relying solely on test scores risks obscuring deeper problems with face recognition while overstating its effectiveness and real-life safety.

How are facial recognition systems tested?

Many facial recognition systems are tested by the National Institute of Standards and Technology (NIST). In one of its tests, NIST uses companies' algorithms to search for a face within a large "matching database" of faces. In broad strokes, this test resembles how police use face recognition today: an image of a single unknown person's face is fed into an algorithm that compares it against a large database of mugshot or driver's license photos and generates suggested images, each paired with a number that estimates how similar the two images are.

These and other tests can reveal disturbing racial disparities. In their [groundbreaking research](http://gendershades.org/overview.html), computer scientists Dr. Joy Buolamwini and Dr. Timnit Gebru tested several prominent gender classification algorithms and [found that](https://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf) those systems were less likely to accurately classify the faces of women with darker complexions. Following that, the ACLU of Northern California performed its own test of Amazon's facial recognition software, which [falsely matched](https://www.aclu.org/news/privacy-technology/amazons-face-recognition-falsely-matched-28) the faces of 28 members of Congress with faces in a mugshot database, with members of color misidentified at higher rates. Since then, additional testing by [NIST](https://pages.nist.gov/frvt/reports/demographics/nistir_8429.pdf) and [academic researchers](https://openaccess.thecvf.com/content/WACV2023W/DVPBA/papers/Bhatta_The_Gender_Gap_in_Face_Recognition_Accuracy_Is_a_Hairy_WACVW_2023_paper.pdf) indicates that these problems persist.

While testing facial recognition for accuracy and fairness across race, sex, and other characteristics is critical, the tests do not take full account of practical realities. No laboratory test represents the conditions of [how police use face recognition](https://www.aclu.org/cases/parks-v-mccormac?document=Amicus-Brief) in real-world scenarios. For one, testing labs do not have access to the exact "matching database" — the particular digital library of faces from mugshots, licenses, and surveillance photos — that police in a specific community search when they operate face recognition.
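To make the 1:N search concrete, here is a minimal sketch of the pipeline described above. It is illustrative only, not NIST's test harness or any vendor's product: the identifiers are hypothetical, the random vectors stand in for the embeddings a trained face model would produce, and cosine similarity is just one common scoring choice. The point to notice is that the system always returns the k most similar faces with scores; it never answers the question "is this person actually in the database?"

```python
# A minimal, assumption-laden sketch of a 1:N face search: a probe
# embedding is compared against a "matching database" of embeddings,
# and a ranked candidate list with similarity scores comes back.
import numpy as np

def cosine_similarity(probe: np.ndarray, gallery: np.ndarray) -> np.ndarray:
    """Cosine similarity between one probe vector and each gallery row."""
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    return gallery @ probe

def search(probe: np.ndarray, gallery: np.ndarray, ids: list[str], k: int = 10):
    """Return the k most similar enrolled faces, best first.

    Note what this does NOT do: it never decides whether the probe
    person is enrolled at all. It always returns the k nearest faces,
    with scores, even when the true person is absent from the database.
    """
    scores = cosine_similarity(probe, gallery)
    top = np.argsort(scores)[::-1][:k]
    return [(ids[i], float(scores[i])) for i in top]

# Toy demo with random stand-in embeddings (real systems use learned ones).
rng = np.random.default_rng(0)
gallery = rng.normal(size=(10_000, 512))    # 10k enrolled face embeddings
ids = [f"person_{i}" for i in range(10_000)]  # hypothetical identifiers
probe = rng.normal(size=512)                # unknown face from a probe image
for person, score in search(probe, gallery, ids, k=5):
    print(person, round(score, 3))
```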
Nor can tests account for the full range of low-quality images from surveillance cameras ([and truly dubious sources](https://www.wired.com/story/parabon-nanolabs-dna-face-models-police-facial-recognition/)) that police feed into these systems, or for the trouble police have when visually reviewing and choosing from a set of possible matches produced by the technology.

Related: [Parks v. McCormac](https://www.aclu.org/cases/parks-v-mccormac) (Source: American Civil Liberties Union)

In response to these real concerns, vendors routinely hold up their performance on tests in their [marketing to government agencies](https://www.clearview.ai/post/debunking-the-three-biggest-myths-about-clearview-ai) as evidence of facial recognition's reliability and accuracy. Lawmakers have also [sought to legislate performance scores](https://legiscan.com/CA/text/AB642/id/2796168), setting across-the-board accuracy or error-rate requirements and allowing police to use any FRT system that clears them. This approach would be misguided.

How can performance scores be misleading?

It is easy to be misled by performance scores. Imagine a requirement that police may only use systems whose overall true positive rate — a measure of how often the results returned by an FRT system include a match for the person depicted in the probe image, when a matching image is in the database — exceeds 98 percent in testing. At first glance, that might sound like a strong requirement, but a closer look reveals a very different story.

For one, police typically configure and customize facial recognition systems to return a list of multiple results, sometimes as many as hundreds. Think of this as a "digital lineup." In NIST testing, if at least one of the returned results is a match for the probe image, the search is considered successful and counts toward the true positive rate. But even when this happens in practice — which certainly isn't always the case — there is no guarantee that police will select the true match rather than one of the other results.
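Here is a similarly simplified sketch of that counting convention, assuming (as in the example above) that each search returns a ranked candidate list. A search counts toward the true positive rate if the true identity appears anywhere in the list, even buried behind lookalikes a reviewer could mistakenly pick; the identity labels are hypothetical, and this is a toy, not NIST's actual evaluation code.

```python
# Sketch of the rank-k "true positive rate" convention described above:
# each search is a (true_identity, ranked_candidate_ids) pair for a
# probe whose mate IS enrolled in the database. The search counts as a
# hit if the true identity appears ANYWHERE in the candidate list --
# regardless of whether a human reviewer would pick it over the other
# k-1 similar-looking candidates.

def top_k_true_positive_rate(searches: list[tuple[str, list[str]]]) -> float:
    hits = sum(1 for truth, candidates in searches if truth in candidates)
    return hits / len(searches)

searches = [
    # Counted as a hit, though the true match ranks behind two lookalikes
    # that a witness or officer must not be fooled by.
    ("person_42", ["person_17", "person_901", "person_42", "person_3"]),
    ("person_7",  ["person_7", "person_55", "person_610", "person_12"]),
    ("person_88", ["person_23", "person_9", "person_4", "person_301"]),  # miss
]
print(top_k_true_positive_rate(searches))  # 2/3, about 0.667
```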
<p>Now consider another metric, the false positive rate, which measures how often an FRT search returns results when there is no matching image in the database. Broken down by race, the same algorithm that produces a 98 percent overall true positive rate can also produce a false positive rate for Black men several times the false positive rate for white men, and an even higher false positive rate for Black women. This example is not merely hypothetical: in <a href="https://nvlpubs.nist.gov/nistpubs/ir/2019/nist.ir.8280.pdf">NIST testing</a>, many algorithms have exhibited this pattern. (1) <a href="https://pages.nist.gov/frvt/html/frvt_demographics.html">Other recent NIST testing</a> also shows algorithms producing false positive rates tens or hundreds of times higher for females older than 65 born in West African countries than for males ages 20-35 born in Eastern European countries. (2)</p>
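<p>A sketch of how this disaggregation works, using invented counts purely for illustration rather than real test data: the false positive rate is computed from “nonmated” searches, probes with no match in the database, and only the per-group breakdown reveals the gap that an overall score conceals.</p>

<pre><code>from collections import defaultdict

def false_positive_rates_by_group(nonmated_searches):
    """nonmated_searches: list of (group, returned_any_result) pairs for
    probes with NO matching image in the database, so any returned result
    is by definition a false positive."""
    totals = defaultdict(int)
    false_positives = defaultdict(int)
    for group, returned_any_result in nonmated_searches:
        totals[group] += 1
        false_positives[group] += returned_any_result
    return {group: false_positives[group] / totals[group] for group in totals}

# With invented counts, an algorithm can clear a 98 percent overall true
# positive bar while this breakdown shows a false positive rate for one
# demographic group many times the rate for another.
</code></pre>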
<p>By considering only the true positive rate, we miss these extreme disparities, which can have devastating consequences. Across the United States, police are arresting people based on false matches, harming people like <a href="https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html">Nijeer Parks</a>, a Black man whom police in New Jersey falsely arrested and held in jail for ten days because they trusted the results of face recognition while overlooking obvious exonerating evidence. Human over-reliance on face recognition is already a problem; focusing on performance scores could make it worse.</p>

<h2>What’s the takeaway for policymakers?</h2>

<p>Lawmakers should know that a facial recognition algorithm’s performance on a test cannot be easily or quickly generalized into broad claims about whether the algorithm is safe. Performance scores are not an easy fix for the harms resulting from the use of face recognition systems, and they of course don’t account for the humans who will inevitably be in the loop.</p>

<p>As the ACLU explained in its recent <a href="https://www.aclu.org/documents/aclu-comment-facial-recognition-and-biometric-technologies-eo-14074-13e">public comment</a> to the Biden Administration, the problems of facial recognition run deep and extend beyond the software itself. Facial recognition is dangerous if it’s inaccurate, a problem that testing aims to address, but it is dangerous even if it could hypothetically be perfectly accurate. In such a world, governments could use face surveillance to precisely track us as we leave home, attend a protest, or take public transit to the doctor’s office. This is why policymakers in an expanding list of U.S. cities and counties have decided to prohibit government use of face recognition, and why the <a href="https://www.aclu.org/press-releases/aclu-calls-moratorium-law-and-immigration-enforcement-use-facial-recognition">ACLU supports a federal moratorium</a> on its use by law and immigration enforcement agencies.</p>

<p>Conversations about the shortcomings of performance scores are important, but instead of trying to find some magic number, policymakers should focus on how any use of facial recognition can expand discriminatory policing, massively expand the power of government, and create the conditions for authoritarian control of our private lives.</p>

<h4>Endnotes:</h4>

<p>(1) For one demonstrative example, an FRT algorithm developed by the vendor NEC and submitted to NIST’s vendor testing program produced an overall true positive rate above 98 percent in some of the testing. See National Institute of Standards and Technology, Face Recognition Vendor Test Report Card for NEC-2, https://pages.nist.gov/frvt/reportcards/1N/nec_2.pdf (finding a false negative identification rate (FNIR) of less than 0.02, or 2 percent, in testing using multiple datasets; the true positive identification rate (TPIR) is one minus the FNIR). However, in other NIST testing, the same algorithm also produced false positive rates for Black men more than three times the rate for white men at various thresholds. See Patrick Grother et al., U.S. Dep’t of Com., Nat’l Inst. for Standards & Tech., Face Recognition Vendor Test Part 3: Demographic Effects Annex 16 at 34 fig.32 (Dec. 2019), https://pages.nist.gov/frvt/reports/demographics/annexes/annex_16.pdf.</p>
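<p>To make the conversion in endnote (1) explicit: with a false negative identification rate below 0.02, the corresponding true positive identification rate is TPIR = 1 - FNIR = 1 - 0.02 = 0.98, i.e., above 98 percent.</p>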
<p>(2) See National Institute of Standards and Technology, Face Recognition Technology Evaluation: Demographic Effects in Face Recognition, FRTE 1:1 Demographic Differentials Summary, False Positive Differentials, https://pages.nist.gov/frvt/html/frvt_demographics.html (last visited February 6, 2024). The table summarizes demographic differentials in false match rates for various 1:1 algorithms and highlights that many algorithms exhibit false match rate differentials across images of people of different ages, sexes, and regions of birth. For example, the algorithm labeled “recognito_001” produced a false match rate for images of females over 65 born in West African countries 3,000 times the false match rate for images of males ages 20-35 born in Eastern European countries. NIST notes that “While this table lists results for 1:1 algorithms, it will have relevance to that subset of 1:N algorithms that implement 1:N search as N 1:1 comparisons followed by a sort operation. The demographic effects noted here will be material in 1:N operations and will be magnified if the gallery and the search stream include the affected demographic.”</p>
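<p>NIST’s caveat can be made concrete with a short sketch of the construction it describes, a 1:N search implemented as N 1:1 comparisons followed by a sort (the function names here are hypothetical): any demographic skew in the 1:1 comparison scores is inherited by the ranked candidate list.</p>

<pre><code>def one_to_n_via_one_to_one(probe, gallery, compare_1to1, k=10):
    """Implement a 1:N search as N 1:1 comparisons followed by a sort,
    the construction NIST describes above. compare_1to1(probe, entry)
    returns a similarity score; demographic differentials in those scores
    carry directly into -- and per NIST can be magnified in -- the results."""
    scored = [(entry_id, compare_1to1(probe, entry))
              for entry_id, entry in gallery]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:k]
</code></pre>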