A false arrest driven by facial recognition has upended the life of Angela Lipps, a 50-year-old grandmother from Tennessee who spent five months in custody after being wrongly identified as a suspect in a bank fraud case in North Dakota, a state she had never visited.

Lipps’ ordeal began in July, when police officers arrived at her rental home and arrested her. By late October, she had been extradited more than 1,000 miles to Fargo, North Dakota, a journey that was also her first time on an airplane.
The initial identification came from Clearview AI, facial recognition software used by the West Fargo Police Department. The technology flagged Lipps based on similarities between her face and surveillance images tied to a local fraud case. Fargo Police Chief Dave Zibolski admitted that the West Fargo system was “part of the issue” leading to her wrongful arrest. Although additional investigative steps were taken, those measures failed to prevent her extradition and prolonged detention.
Complications continued when Lipps spent three months detained in Tennessee because the Cass County Sheriff’s Office did not inform North Dakota authorities that she had signed an extradition waiver. By the time she arrived in Fargo, Lipps had already suffered significant personal and financial consequences. She lost her rental property, her storage unit belongings were seized due to unpaid bills, and her reputation had been severely damaged. In a GoFundMe campaign, she described the experience as “terrifying and exhausting and humiliating.”
Once in Fargo, Lipps’ lawyer quickly obtained bank records proving she had been in Tennessee at the time of the alleged fraud; the evidence dismantled the case within minutes. On December 23, just over five months after her initial arrest, a Fargo detective, the state’s attorney, and a judge agreed to dismiss the charges without prejudice to allow further investigation. Lipps was released on Christmas Eve.
In the aftermath, Zibolski stated that the Fargo Police Department would no longer rely on information from West Fargo’s Clearview AI system. He also confirmed that all facial recognition identifications would now be reviewed monthly by the department’s Investigation Division commander to maintain oversight. He acknowledged that proper procedure should have involved submitting surveillance images to agencies trained in facial recognition before any arrests were made.
Lipps’ experience exposes the risks inherent in AI-driven law enforcement tools. Her case highlights the need for careful oversight, transparency, and accountability when technology directly affects individual liberty. Five months in custody, along with the loss of property and public humiliation, underscore the human cost of misapplied AI.
Her story raises urgent questions about how law enforcement agencies deploy facial recognition, how results are verified, and how quickly human checks must intervene to keep innocent people from suffering unnecessary legal and personal consequences. For Angela Lipps, the ordeal has left lasting marks: a stark reminder of the limits of AI in policing and the importance of procedural safeguards.