It began with a search for the guy who grabbed $8,950 from a mobile phone repair shop on Bergenline Avenue in West New York, N.J., in 2019 and pistol-whipped the clerk who tried to stop him. All police had to go on in their effort to identify him was surveillance video.
The images from that video were not particularly good. The State Police initially attempted a facial recognition analysis — sophisticated software that uses biometrics to map a face and match it up against a database of mug shots and arrest photos in hopes of coming up with a name and identity. They came up with no matches. An investigator offered to re-run the inquiry if detectives could provide a better image.
Instead, detectives sent the video to the New York Police Department, which conducted a similar facial recognition analysis against its own wider database of imagery, ultimately offering up Arteaga as a “possible match.” The 46-year-old Queens man, who had served time in New York for assault and burglary, was soon arrested and charged by Hudson County officials with robbery, aggravated assault and unlawful possession of a weapon.
Exactly how the NYPD’s software performed its magic, along with its accuracy in picking needles out of haystacks, is now a focal point of the defense in the Arteaga case.
Facial recognition is increasingly being deployed around the country, and proponents call it an important crime-fighting and security tool. Federal investigators were able to use it to identify some of those accused in the Jan. 6 insurrection at the Capitol, a number of whom were later convicted.
Critics, however, warn the growing use of the technology brings with it the danger that mistakes and flawed identification could put innocent people in prison. With some programs using photos scraped from social media accounts, they call it a threat to civil liberties.
“These technologies can be invasive and pervasive, potentially infringing on individuals’ privacy rights, freedom of association, and freedom of movement, as well as rights under the Fourth and Fourteenth Amendments,” said state Sen. Nia Gill, D-Essex, who sponsored legislation that would require a public hearing prior to the use of any facial recognition technology by any law enforcement agency.
In the Assembly, a separate bill would require testing of facial recognition systems by independent, third-party agencies.
Dillon Reisman, a staff attorney with the American Civil Liberties Union of New Jersey, said “inaccurate and biased” facial recognition technology has led police departments elsewhere to throw the wrong person in jail with no accountability. Noting that facial recognition systems frequently perform worse at identifying people with darker skin, he added that communities of color bear the risk of misidentification at a disproportionate rate.
The purpose of facial recognition, he observed, is to find people who look like someone in a picture.
“There are so many stories of people being misidentified,” he said.
Randal Reid was one of them. He spent nearly a week in jail last November after a facial recognition match by authorities in Louisiana led to the Georgia man being falsely accused of stealing purses in a state he said he had never visited.
In Paterson, Nijeer Parks said the use of facial recognition software led to his being falsely accused in February 2019 of shoplifting from a Woodbridge hotel gift shop and then fleeing, nearly running over an officer in the process. He spent 10 days in jail before the case was finally dismissed for lack of evidence.
Parks sued the police, the prosecutor and the City of Woodbridge for false arrest, false imprisonment and violation of his civil rights. The ongoing litigation, which was transferred two years ago to federal court, remains unresolved, said his attorney, Daniel Sexton of Jersey City.
“There should be transparency regarding this new and unproven technology,” he said. “In my case, they got a hit and dropped everything. Clearly they got the wrong guy. But they got the hit.”
Sexton said investigators have treated the technology like it was magic, but likened facial recognition to the early days of GPS — when still-immature mapping systems were directing unwary drivers off cliffs.
A spokesman for Woodbridge declined comment while the matter remains pending in court.
Gill said one of her biggest concerns over the technology is that an algorithmic bias that can exist in facial recognition technology disproportionately impacts people of color, leading to false identifications.
“This perpetuates systemic racial bias, leading to potential injustices in law enforcement and other sectors where such technology is used,” said the senator.
During a 2019 hearing before the House Oversight and Reform Committee, representatives of the ACLU noted they had tested one facial recognition product by matching photos of members of Congress against 25,000 mugshots. There were 28 false matches. Of those falsely matched, about 40 percent were members of color.
A spokeswoman for the Hudson County Prosecutor’s office declined to discuss the case against Arteaga. His public defender did not respond to requests for comment. But court filings in the matter detailed the path investigators took and the use of facial recognition technology that led to his arrest.
Buenavista Multiservices on Bergenline Avenue in West New York is a place to make international wire transfers, fix a broken cell phone or purchase phone accessories. Around 3:30 p.m. on a cold November afternoon in 2019, a man walked in and asked a woman behind the counter who was counting money about wiring funds to South America, according to one of the court briefs. As she turned toward her computer, he stepped around the counter and pointed a gun at her, grabbing the cash she had been handling. When she tried to stop him, police said he hit her with the gun, lacerating her left ear, before taking off on foot.
The store’s surveillance camera had captured images of the man, as did video security on the street. Copies of the images were sent to the State Police for a facial recognition analysis, but no matches within its database were found. Detectives then sent all the surveillance footage to the Facial Identification Section of the New York Police Department Real Time Crime Center, which identified Arteaga as a “possible match.” His mug shot was in the system after he served three years for assault from 2003 to 2006 and another three years for attempted burglary from 2015 to 2018.
The store clerk and the store’s manager then picked Arteaga’s photo out of a grouping of other photos, identifying him as the assailant.
Attorneys for Arteaga sought to force prosecutors to reveal more details on the algorithms and facial recognition technology used by the New York Police Department in identifying his picture as a possible match with the surveillance video.
Without any information about the reliability of the facial recognition system, the public defender argued, “there can be no fair trial if the fruits of that technology’s use are admitted into evidence.”
When the trial judge in the case denied the request, the matter landed before the state appellate division, which reversed the ruling.
“The reliability of the technology bears direct relevance to the quality and thoroughness of the broader criminal investigation, and whether the potential matches the software returned yielded any other viable alternative suspects to establish third-party guilt,” said the appellate panel, agreeing with the defense that information related to the design, specifications, and operation of the program used for the facial identification that zeroed in on Arteaga was relevant.
They wrote that the defense “must have the tools to impeach the state’s case and sow reasonable doubt.”
The New York Police Department cited statistics that noted its Facial Identification Section in 2019 received 9,850 requests for comparison and identified 2,510 possible matches, including possible matches in 68 murders, 66 rapes, 277 felony assaults, 386 robberies, and 525 grand larcenies.
“The NYPD knows of no case in New York City in which a person was falsely arrested on the basis of a facial recognition match,” said the department.
The state Attorney General’s office, which has banned the use of one facial recognition program by New Jersey police departments because its database included millions of photos posted to social media, last year opened a public comment period aimed at shaping statewide policy on the use of the technology by law enforcement.
No additional action has yet been taken, said a spokesman for the office.
Arteaga, meanwhile, remains in jail, according to state court records, awaiting trial.
©2023 Advance Local Media LLC. Distributed by Tribune Content Agency, LLC.