01/18/2025 / By Ava Grace
More and more police departments are turning to artificial intelligence-powered facial recognition software – leading to a rise in false arrests.
An investigation found that police departments nationwide are relying on facial recognition software to identify and arrest suspects, often without corroborating evidence. Shockingly, most departments are not required to disclose or document their use of this technology.
Among 23 departments with available records, 15 across 12 states made arrests based solely on AI matches, frequently violating their own internal policies that mandate additional evidence before taking such drastic action.
From the information gathered, there have been eight documented wrongful arrests in the U.S. based on AI facial recognition matches, two of them previously unreported. All charges against the individuals were eventually dismissed. The investigation notes that basic police work, such as verifying alibis and comparing physical evidence, could easily have prevented these wrongful arrests.
The report further warns that data on wrongful arrests from facial recognition technology is very sparse and the true scale of the problem remains unknown, as most departments lack transparency and refuse to disclose their use of AI facial recognition tech.
Facial recognition software uses algorithms to analyze and compare facial features in photographs or video footage. The technology maps unique characteristics – such as the distance between the eyes, the shape of the jawline and the contours of the nose – to create a digital “faceprint.” This faceprint is then compared against a database of images to find potential matches.
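To make the matching step concrete, here is a minimal sketch in Python of how a faceprint lookup might work. Everything in it is illustrative: real systems derive faceprints from deep neural networks and search galleries of millions of images, while this example uses tiny made-up vectors, hypothetical names and an arbitrary similarity threshold.

```python
import numpy as np

# Illustrative only: real systems derive faceprints from deep neural
# networks; these tiny made-up vectors and names are hypothetical.
database = {
    "person_a": np.array([0.12, 0.87, 0.44, 0.31]),
    "person_b": np.array([0.90, 0.05, 0.22, 0.67]),
    "person_c": np.array([0.15, 0.80, 0.50, 0.28]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Score two faceprints: 1.0 means identical direction, 0.0 unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe: np.ndarray, threshold: float = 0.95):
    """Return the closest database entry and its score, or None if no
    score clears the threshold. The threshold is an arbitrary knob:
    lowering it yields more matches but also more false positives."""
    name, score = max(
        ((n, cosine_similarity(probe, v)) for n, v in database.items()),
        key=lambda item: item[1],
    )
    return (name, score) if score >= threshold else None

# A probe resembling person_a but not identical, as a photo taken under
# different lighting or from a different angle would be.
probe = np.array([0.14, 0.85, 0.47, 0.30])
print(best_match(probe))          # e.g. ('person_a', 0.999...)
print(best_match(probe, 0.9999))  # None: stricter threshold, no match
```

Even in this toy version, the key point is visible: the system returns the closest candidate above a tunable threshold, not a verified identity. The top match is an investigative lead at best, which is why many departments' own policies require corroborating evidence before an arrest.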
While this might sound like a foolproof system, the reality is far more complicated. Facial recognition algorithms are far from perfect: they can struggle with low-quality images, poor lighting or unusual angles.
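A small, self-contained sketch (again with made-up numbers) shows why degraded input is dangerous. Random noise here stands in for blur, glare or an odd camera angle, and it shrinks the gap between the true person and a lookalike until the ranking flips.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical faceprints for two different people who resemble each
# other (their vectors are nearly parallel).
true_person = np.array([0.20, 0.75, 0.55, 0.30])
lookalike   = np.array([0.24, 0.72, 0.58, 0.27])

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Simulate image degradation as additive noise on the probe's faceprint
# and count how often the wrong person becomes the top match.
trials, flips = 1000, 0
for _ in range(trials):
    noisy_probe = true_person + rng.normal(0.0, 0.25, size=4)
    if cosine_similarity(noisy_probe, lookalike) >= cosine_similarity(noisy_probe, true_person):
        flips += 1

print(f"With heavy noise, the lookalike outranked the true person "
      f"in {flips}/{trials} trials.")
```

In a real gallery of millions of faces, the odds that some stranger scores higher than the actual person grow accordingly.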
More troublingly, these systems often exhibit racial and gender biases. Studies have shown that facial recognition software is significantly more likely to misidentify women and people of color, particularly those with darker skin tones.
This inherent bias has had devastating consequences. Take the case of Randal Reid, who spent a week in jail for a crime committed in a state he had never visited.
Or Porcha Woodruff, a visibly pregnant woman who was arrested for carjacking despite being physically incapable of committing such an act.
Robert Williams, the first documented victim of a wrongful arrest due to facial recognition, was accused of stealing thousands of dollars' worth of watches – even though he was driving home at the time of the alleged crime. (Related: SURVEILLANCE? U.S. expands biometric technology in airports.)
At the heart of this issue lies a deeply human problem: confirmation bias. When facial recognition software identifies a suspect, law enforcement officers are often quick to accept the result as definitive, ignoring contradictory evidence.
As Mitha Nandagopalan, a staff attorney with the Innocence Project, explains, “When police showed up to Porcha’s house, she was visibly eight months pregnant, yet there was nothing in the victim’s description of the suspect that mentioned pregnancy. The circumstances described would be very difficult for someone near the end of a pregnancy to carry out, and yet they went forward with the arrest.”
Watch this video showing how facial recognition technology is being used in airports.
This video is from the Marjory Wildcraft channel on Brighteon.com.
More related stories:
Air Canada rolls out facial recognition technology at boarding gates for domestic flights.
Malfunctioning facial recognition technology may put innocent individuals at risk.
Fashion company creating clothing line that shields people from AI facial recognition technology.