Columbus, OH – Today, the American Civil Liberties Union, the ACLU of Ohio, and the National Association of Criminal Defense Lawyers filed an amicus curiae brief in Ohio’s 8th District Court of Appeals in support of Appellee Qeyeon Tolbert in State v. Tolbert, a case addressing police’s improper concealment of their reliance on facial recognition technology when applying for a search warrant. 
 
Following a February 2024 homicide on a Cleveland street, Cleveland Police obtained video footage from a nearby convenience store showing someone matching the description of the suspect and sent it to a state law enforcement intelligence group, which ran the image through an artificial intelligence-powered facial recognition search using software from a company named Clearview AI. As police later testified, the facial recognition search produced “multiple photos of multiple people” as possible matches, including Mr. Tolbert. Authorities focused on Mr. Tolbert as the likely suspect. But when Cleveland Police applied for a warrant to search his home, they concealed from the judge that they had relied on face recognition technology at all, much less that the technology is unreliable. The government subsequently charged Mr. Tolbert with murder after finding evidence in his house. The trial judge ultimately suppressed the evidence, finding that the search warrant was invalid, but prosecutors appealed the ruling to the 8th District Court of Appeals. 
 
The amicus brief defends the trial judge’s ruling and argues that the search warrant was not supported by probable cause because the output of this facial recognition technology cannot provide a reliable identification, a limitation Clearview AI itself discloses in a disclaimer included with its search results. The brief explains that face recognition technology is fundamentally unreliable because of well-known technical limitations, racially disparate false-match rates, and human operator errors. These sources of unreliability must be disclosed to judges when police apply for warrants. 
 
“In an age where artificial intelligence and facial recognition technology have become so pervasive and widespread, with limited oversight and regulation, it is important for courts to recognize the dangers and unreliability of this evidence in the issuance of a search warrant,” added Amy Gilbert, Senior Staff Attorney for the ACLU of Ohio. “Face recognition technology grants police unprecedented and dangerous power because it doesn’t require the knowledge, consent, or participation of the individual and is often used in secretive ways without oversight.” 
 
“When police hide their use of fundamentally unreliable face recognition technology from judges, it undermines the ability of courts to ensure protection of our constitutional rights,” said Nathan Freed Wessler, Deputy Director of the ACLU Speech, Privacy, and Technology Project. “Face recognition technology often gets it wrong and is particularly error-prone when used to try to identify people of color. The appeals court has an important opportunity to affirm that when the government breaks the rules by keeping its use of this flawed technology secret, it should be held to account.” 

“This case exposes how law enforcement conceals dangerous and unreliable surveillance tools from public view,” said Sidney Thaxter of the NACDL’s Fourth Amendment Center. “This technology has no place in the justice system or an open and free society.”

The legal organizations urge the 8th District Court of Appeals to uphold the trial judge’s decision.