American Civil Liberties Union

Police Say a Simple Warning Will Prevent Face Recognition Wrongful Arrests. That's Just Not True.

Face recognition technology in the hands of police is dangerous. Police departments across the country frequently use the technology to try to identify images of unknown suspects by comparing them to large photo databases, but it often fails to generate a correct match. And numerous [studies](https://www.washingtonpost.com/technology/2019/12/19/federal-study-confirms-racial-bias-many-facial-recognition-systems-casts-doubt-their-expanding-use/) have shown that face recognition technology misidentifies Black people and other people of color at higher rates than white people. To date, there have been at least seven wrongful arrests we know of in the United States due to police reliance on incorrect face recognition results, and those are just the known cases. In nearly every one of those instances, [the](https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html) [person](https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html) [wrongfully](https://www.newyorker.com/magazine/2023/11/20/does-a-i-lead-police-to-ignore-contradictory-evidence/) [arrested](https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html) [was](https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html) [Black](https://www.freep.com/story/news/local/michigan/detroit/2020/07/10/facial-recognition-detroit-michael-oliver-robert-williams/5392166002/).

Supporters of police using face recognition technology often portray these failures as unfortunate mistakes that are unlikely to recur. Yet they keep coming. Last year, six Detroit police officers showed up at the doorstep of an [eight-months-pregnant woman](https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html) and wrongfully arrested her in front of her children for a carjacking that she could not plausibly have committed. A month later, the prosecutor dismissed the case against her.

Related: [I Did Nothing Wrong. I Was Arrested Anyway.](https://www.aclu.org/news/privacy-technology/i-did-nothing-wrong-i-was-arrested-anyway) (Source: American Civil Liberties Union)

Police departments should be doing everything in their power to avoid wrongful arrests, which can turn people’s lives upside down and result in loss of work, inability to care for children, and other harmful consequences. So what’s behind these repeated failures? As the ACLU explained in a [recent submission](https://www.aclu.org/documents/aclu-comment-facial-recognition-and-biometric-technologies-eo-14074-13e) to the federal government, there are multiple ways in which police use of face recognition technology goes wrong. Perhaps most glaring is that the most widely adopted police policy designed to avoid false arrests in this context *simply does not work*. Records from the wrongful arrest cases demonstrate why.

It has become standard practice among police departments and companies making this technology to warn officers that a result from a face recognition search does not constitute a positive identification of a suspect, and that additional investigation is necessary to develop the probable cause needed to obtain an arrest warrant.
For example, the International Association of Chiefs of Police [cautions](https://www.theiacp.org/sites/default/files/2019-10/IJIS_IACP%20WP_LEITTF_Facial%20Recognition%20UseCasesRpt_20190322.pdf) that a face recognition search result is “a strong clue, and nothing more, which must then be corroborated against other facts and investigative findings before a person can be determined to be the subject whose identity is being sought.” The Detroit Police Department’s face recognition technology [policy](https://detroitmi.gov/sites/detroitmi.localhost/files/2020-10/307.5%20Facial%20Recognition.pdf), adopted in September 2019, similarly states that a face recognition search result is only “an investigative lead and IS NOT TO BE CONSIDERED A POSITIVE IDENTIFICATION OF ANY SUBJECT. Any possible connection or involvement of any subject to the investigation must be determined through further investigation and investigative resources.”

Related: [ACLU Comment re: Request for Comment on Law Enforcement Agencies' Use of Facial Recognition Technology, Other Technologies Using Biometric Information, and Predictive Algorithms (Exec. Order 14074, Section 13(e))](https://www.aclu.org/documents/aclu-comment-facial-recognition-and-biometric-technologies-eo-14074-13e) (Source: American Civil Liberties Union)

Police departments across the country, from [Los Angeles County](https://lacris.org/LACRIS%20Facial%20Recognition%20Policy%20v_2019.pdf) and the [Indiana State Police](https://www.in.gov/iifc/files/Indiana_Intelligence_Fusion_Center_Face_Recognition_Policy.pdf) to the U.S. [Department of Homeland Security](https://www.dhs.gov/sites/default/files/2023-09/23_0913_mgmt_026-11-use-face-recognition-face-capture-technologies.pdf), provide similar warnings. However ubiquitous, these warnings have failed to prevent harm.

We’ve seen police treat the face recognition result as a positive identification, ignoring or not understanding the warnings that face recognition technology is simply not reliable enough to provide a positive identification.

In Louisiana, for example, police relied solely on an incorrect face recognition search result from Clearview AI as purported probable cause for an arrest warrant.
The officers did this even though the law enforcement agency had signed a contract with the face recognition company acknowledging that officers “must conduct further research in order to verify identities or other data generated by the [Clearview] system.” That overreliance led to [Randal Quran Reid](https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html), a Georgia resident who had never even been to Louisiana, being wrongfully arrested for a crime he couldn’t have committed and held for nearly a week in jail.

In an [Indiana investigation](https://www.courierpress.com/story/news/local/2023/10/19/evansville-police-using-clearview-ai-facial-recognition-to-make-arrests/70963350007/), police similarly obtained an arrest warrant based only upon an assertion that the detective “viewed the footage and utilized the Clearview AI software to positively identify the female suspect.” No additional confirmatory investigation was conducted.

But even when police do conduct additional investigative steps, those steps often *exacerbate and compound* the unreliability of face recognition searches. This is a particular problem when police move directly from a face recognition result to a witness identification procedure, such as a photographic lineup.

Face recognition technology is designed to generate a list of faces that are *similar* to the suspect’s image, but often will not actually be a match. When police think they have a match, they frequently ask a witness who saw the suspect to view a photo lineup consisting of the image derived from the face recognition search, plus five “filler” photos of other people. Photo lineups have long been known to carry a high risk of misidentification. The addition of face recognition-generated images only makes it worse. Because the face recognition-generated image is likely to appear more similar to the suspect than the filler photos, there is a [heightened chance](https://www.newyorker.com/magazine/2023/11/20/does-a-i-lead-police-to-ignore-contradictory-evidence/) that a witness will [mistakenly choose](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4101826) that image out of the lineup, even though it is not a true match.

This problem has contributed to known cases of wrongful arrest, including the arrests of [Porcha Woodruff](https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html), [Michael Oliver](https://www.freep.com/story/news/local/michigan/detroit/2020/07/10/facial-recognition-detroit-michael-oliver-robert-williams/5392166002/), and [Robert Williams](https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html) by Detroit police (the ACLU represents Mr. Williams in a [wrongful arrest lawsuit](https://www.aclu.org/news/privacy-technology/i-did-nothing-wrong-i-was-arrested-anyway)). In these cases, police obtained an arrest warrant based solely on the combination of a false match from face recognition technology and a false identification from a witness viewing a photo lineup that was constructed around the face recognition lead and five filler photos. Each of the witnesses chose the face recognition-derived false match, instead of deciding that the suspect did not, in fact, appear in the lineup.

A lawsuit filed earlier this year in Texas alleges that a similar series of failures led to the wrongful arrest of [Harvey Eugene Murphy Jr.](https://www.theguardian.com/technology/2024/jan/22/sunglass-hut-facial-recognition-wrongful-arrest-lawsuit?ref=upstract.com) by Houston police.
And in New Jersey, police wrongfully arrested [Nijeer Parks](https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html) in 2019 after face recognition technology incorrectly flagged him as a likely match to a shoplifting suspect. An officer who had seen the suspect (before he fled) viewed the face recognition result, and said he thought it matched his memory of the suspect’s face.

After the Detroit Police Department’s third wrongful arrest from face recognition technology became public last year, Detroit’s chief of police [acknowledged](https://www.facebook.com/CityofDetroit/videos/287218473992047) the problem of erroneous face recognition results tainting subsequent witness identifications. He explained that by moving straight from face recognition result to lineup, “it is possible to taint the photo lineup by presenting a person who looks most like the suspect” but is not in fact the suspect. The department’s policy, merely telling police that they should conduct “further investigation,” had not stopped police from engaging in this bad practice.

Because police have repeatedly proved unable or unwilling to follow face recognition searches with adequate independent investigation, police access to the technology must be strictly curtailed, and the best way to do this is through strong [bans](https://www.aclu.org/sites/default/files/field_document/02.16.2021_coalition_letter_requesting_federal_moratorium_on_facial_recognition.pdf). More than 20 jurisdictions across the country, from Boston to Pittsburgh to San Francisco, have done just that, barring police from using this dangerous technology.

Boilerplate warnings have proven ineffective. Whether these warnings fail because of human [cognitive bias](https://www.nytimes.com/2020/06/09/technology/facial-recognition-software.html) toward trusting computer outputs, poor police training, incentives to quickly close cases, implicit racism, lack of consequences, the fallibility of witness identifications, or other factors, we don’t know. But if the experience of known wrongful arrests teaches us anything, it is that such warnings are woefully inadequate to protect against abuse.