Still, after several days, Pinellas County, Fla., Sheriff’s Office deputies were no closer to identifying him. Even a flier of the suspect’s photos sent to other law enforcement agencies turned up nothing.
Tampa police, meanwhile, were testing a powerful facial recognition technology that scanned for matches from billions of public images.
Pinellas asked for help and an analyst quickly got a match. Hours later, investigators arrested the suspect.
That’s when Pinellas decided that it needed to try this technology.
It isn’t alone. The Tampa Bay Times found at least a dozen Florida law enforcement agencies have tried out or bought access to the facial recognition system from a little-known company called Clearview AI.
Happy customers say it’s a tool to make the public safer. But some experts and privacy advocates warn that Clearview’s system poses a significant threat to civil liberties.
“We sometimes talk about the slippery slope of government incursions on people’s rights,” said Nathan Wessler, a staff attorney at the American Civil Liberties Union who focuses on surveillance and privacy. “The prospect of this kind of facial recognition in government hands is the bottom of that slope -- it’s the place we don’t want to get to.”
A flurry of media reports, including a January New York Times article, have raised questions about the company’s methods and claims. Social media giants like Twitter, Facebook, YouTube and Venmo have told the company to stop pulling information from their sites. New Jersey’s attorney general has said police in his state should stop using Clearview.
Chief among the concerns about Clearview’s technology is its potential to identify virtually anyone at any time.
Law enforcement officers, using the app on a mobile device, could ID anyone on the street, privacy experts warn, deterring political rallies or even people going about their daily lives.
“It opens up the potential for our entire lives being scrutinized by the government for every reason or no reason at all,” Wessler said.
Some Florida police agencies say the technology’s potential to assist criminal investigations outweighs privacy concerns.
“Do you want to pick the ‘Big Brother’s watching you’ side or the side where you want to catch the bad guy?” said Lt. Mark Leone, spokesman for the Davie Police Department, which is buying access to Clearview’s system. “We pick that we want to catch the bad guy.”
Clearview’s algorithm has unparalleled matching capabilities, according to a description the company sent to agencies. Unlike some other facial recognition tools used by law enforcement, it isn’t limited by the angle of the face, Clearview said.
But perhaps the biggest difference is Clearview’s image database.
Typically, facial recognition software is sold to police so they can scan existing law enforcement-controlled databases of photos. That includes mugshots, driver’s licenses or sex offender registries, said Clare Garvie, senior associate at Georgetown University Law Center’s Center on Privacy and Technology.
Clearview, in contrast, said it searches at least 2 billion public photos scraped from commercial sites such as Facebook, Twitter and Venmo.
“Clearview’s speed and accuracy are unsurpassed. But the true ‘secret sauce’ is data,” Clearview wrote in a marketing document it sent to the Clearwater Police Department and obtained by the Times through a public records request.
It touts successes in helping with theft, bank fraud, child exploitation and other cases.
“What Clearview is (doing) is taking facial recognition and putting it on steroids,” Pinellas County Sheriff Bob Gualtieri said.
He said his agency is weighing whether to purchase licenses for the technology, noting that Clearview’s massive database offers “tremendous” power to match images to faces.
Clearview declined an interview request from the Times but did answer some questions via email. Hoan Ton-That, chief executive of the New York City-based company, stressed that his technology only takes images from public websites and not from anything that has been privacy protected. He said the technology is meant to help identify perpetrators and victims of crimes from materials that have already been obtained, not to identify random members of the public.
Ton-That did not respond to questions about whether Clearview had granted anyone outside of law enforcement access to its database. He also didn’t respond to a question about whether children’s faces have been captured.
The Times identified at least 13 Florida law enforcement agencies that have participated in a free trial of Clearview’s system. Of those, at least four signed contracts for the service. A fifth is about to.
The Gainesville Police Department signed a $10,000 contract in September for seven of its employees to use Clearview. Det. Sgt. Nick Ferrara said every search is recorded to limit misuse. He said police are supposed to verify any leads from using the technology.
Ferrara said he’s used other facial recognition programs, mentioning Vigilant Solutions, owned by Motorola, and the Face Analysis Comparison and Examination System, managed by the Pinellas County Sheriff’s Office and used by law enforcement across Florida. That system, which Pinellas started building in the early 2000s, can access about 38 million images, including mugshots and Florida driver’s license and identification card photos.
But he said his tests of Clearview showed it was “clearly superior.”
He can now search for faces across the country. Ferrara said he’s made “numerous identifications of suspects” using the technology, mentioning shoplifting and financial fraud cases.
“I’m all about using tech to gain an advantage to catch bad guys,” Ferrara said.
The Broward County Sheriff’s Office paid $15,000 last year for 15 licenses, even though a spokesman said the agency is “still in the evaluation stage” of understanding Clearview’s capabilities.
The Volusia County Sheriff’s Office paid $10,000 for six licenses in January. It said the system has contributed to about 30 suspect leads.
On behalf of the Times, the Davie Police Department ran a reporter’s image through Clearview’s system.
In seconds, the agency called up more than 30 images of the reporter’s face, pulled largely from news websites and social media. Police could easily see the reporter’s name, as well as the names and faces of some of her friends and coworkers. The agency said it found photos from her wedding and bachelorette party.
While it seems impressive, the photo scanning hasn’t been extensively vetted, said Georgetown’s Garvie. The only verification of the algorithm’s accuracy comes from the company’s own audit.
With Clearview’s system, Garvie said, “there’s no guarantee that the identity associated with these photos is correct.” It’s possible, she said, for an innocent person to be arrested after being misidentified by the algorithm.
Because the database is so big, Garvie said, the odds of lookalikes and misidentifications are greater.
Clearview’s level of access is also problematic, Garvie said. Most facial recognition software uses law enforcement equipment to scan law enforcement databases. Because Clearview’s product is a subscription service, the company can see what law enforcement agencies are searching. This, experts say, raises all kinds of questions. The company could display incorrect or limited results for a search, tip off the subject of a search, or sell search queries to private entities.
Clearview said it provides the same database with the same results to all its law enforcement clients. It said its audit was done by independent experts and it would be open to doing other tests or audits “that make sense.”
Clearview said its services and trials are available only to verified members of law enforcement. But if Clearview ever decides to expand access to its data, Garvie said, it could be used by stalkers, criminals searching for witnesses in federal protection or those sussing out undercover officers.
Even if the technology works exactly as advertised, the ACLU’s Wessler said, such a powerful tool allows law enforcement or government to target specific populations and quash dissent.
While no Tampa Bay law enforcement agency has so far entered into a contract with Clearview, four have participated in trials. They are the Clearwater Police Department, Tampa Police Department, Pinellas County Sheriff’s Office and University of South Florida Police Department.
“I had the opportunity to meet multiple officers from around the country, and I spoke very highly of your system (I should be a salesman!)” Clearwater Police Department Sgt. Timothy Downes wrote in an October email to the company.
Clearwater’s police chief said his agency opted not to buy licenses from Clearview. Chief Dan Slaughter told the Times he was happy with the facial recognition system run by the Pinellas County Sheriff’s Office.
The Pasco County Sheriff’s Office said it has not tested Clearview, but an email the Times obtained from Capt. Mike Jenkins expressed interest in the product.
Several agencies around the state said they tested Clearview, but it was unclear how extensive the tests were and whether the technology was used in any specific cases.
Clearview’s rise renewed questions about regulation of such technology. Florida does not have any laws limiting the use of biometric information. And law enforcement agencies using facial recognition historically have operated with little oversight or scrutiny.
“That means that there’s nothing explaining to police what they are and are not able to do,” said Wessler of the ACLU.
Though some agencies in Florida have policies or guidelines outlining how employees should use facial recognition technology, others haven’t adopted any rules.
Gualtieri, who is also president of the Florida Sheriffs Association, said a regulatory framework over facial recognition would be helpful, especially given how quickly the technology is emerging.
But he said bans on facial recognition technology -- like those in San Francisco and Oakland, Calif., and Somerville, Mass. -- were an “overreach.”
Privacy advocates say it shouldn’t be up to police to decide.
“It should be up to the citizens who are subject to the use of this technology and the potential privacy and civil liberty risks to be included in the discussion of whether they want their police to have this type of tool,” Garvie said.
State Sen. Keith Perry, chair of the Senate Criminal Justice Committee, said he hasn’t seen much of a policy discussion about facial recognition technology in Florida. But he said that he’d like to “look at our role in regulating it” next legislative session.
Right now, he said, the concerns outweigh the benefits for him. Then he added: “I don’t know enough about it and how we can regulate it.”
©2020 the Tampa Bay Times (St. Petersburg, Fla.). Distributed by Tribune Content Agency, LLC.