
Call to Ban Mass. Facial Recognition Grows Amid Protests

Activists and politicians in Massachusetts say the need to ban government use of facial recognition is dire, as protests about racial injustice sweep the nation. Many are concerned about its bias and lack of transparency.

(TNS) — As the Boston, Mass., City Council readies to vote on an ordinance banning the governmental use of facial recognition technology, activists and politicians Tuesday described the dire need to restrict the software, particularly in light of nationwide protests against racial injustice sparked by the killing of George Floyd in Minneapolis, Minn.

Concerns about the technology abound. Activists, public officials and even some in law enforcement have noted that the software is inaccurate, especially in identifying people of color, that it has already been abused by totalitarian governments and that it may violate the public’s civil liberties and basic privacy.

At a press conference Tuesday ahead of a 3 p.m. Boston City Council hearing in which young residents were slated to speak about the dangers of facial recognition, speakers talked about the harm the technology poses to minorities, immigrants, students and other groups.

“As protests over police violence and systemic racism continue to shake the concrete here in Boston and across the country, the conversation we’re having today about face surveillance is all the more urgent,” said Kade Crockford, director of the Technology for Liberty program at the ACLU of Massachusetts, adding that citizens stand at a “crucial moment” in the nation’s history.

The American Civil Liberties Union of Massachusetts started a campaign last summer called “Press Pause on Face Surveillance” with the hope of making citizens more aware of the risks posed by face surveillance and the need to pass a statewide moratorium.

A major concern is that the technology remains largely unregulated at both the state and national levels. Because no legislation governs law enforcement’s use of the software on a wider scale, the ACLU has turned its attention to passing municipal bans.

The statewide moratorium is currently in the Massachusetts State House’s Joint Committee on the Judiciary, where the bill has received an extension until the end of the committee’s session, according to Crockford.

Although multiple public officials and technology experts testified against the use of the software in October and a strong majority of Massachusetts adults said in a 2019 poll they support a statewide moratorium, the bill may not make headway through the state legislature until July 31 or later, according to Crockford.

“We view these municipal efforts as crucial measures to, frankly, put pressure on the legislature,” Crockford told MassLive on Tuesday. “If the state legislature is not going to act, we have no choice but to work with municipal governments to protect our people, and that’s what we’re doing.”

So far, five communities in Massachusetts have passed either outright bans or temporary moratoriums on the municipal use of facial recognition. Those towns and cities include Brookline, Cambridge, Northampton, Somerville and, most recently, Springfield.

Easthampton may also be poised to pass its own municipal ban following talks between officials in the community’s government and the ACLU of Massachusetts, according to Easthampton City Councilor Peg Conniff.

“Now the campaign turns its attention to Boston, the region’s largest and most economically and politically important city,” said Crockford, noting that the community could be the largest city east of San Francisco to ban what she called a “dangerous, racially biased, dystopian technology.”

The California city was the first community in the country to ban the municipal use of facial recognition software.

Michelle Wu and Ricardo Arroyo, both city councilors in Boston, introduced their community’s ban on the government’s use of the technology in May, arguing the software is plagued by transparency and racial bias issues.

The ACLU of Massachusetts told the public last month that passing an ordinance restricting the technology is especially crucial in Boston, where the city’s contract with BriefCam, a company that runs the community’s surveillance camera network, was expected to expire on May 14.

The existing version of the network did not include facial recognition features, but if officials chose to renew the contract, the city would have been due for a “super-charge” update that could have included instant access to the surveillance tool, the ACLU said.

Wu told reporters at Tuesday’s press conference that she spoke with officials from the Boston Police Department and that, from her understanding, the update has not yet been incorporated.

“I don’t have 100 percent confirmation, but my understanding is, especially given this proposal was filed before that happened, that that has not been added,” the city councilor said. “But I’ll make sure to ask that at the hearing today.”

She added that Boston police have already agreed that facial recognition surveillance is not appropriate for use, and that the department says it does not use the technology.

“However, we know that the technology that they are already in contract with, the system that they have in place does have available a software upgrade that could add face surveillance to their current system without any public process,” Wu said. “We need to make sure that we’re codifying the protections against discrimination and protections of basic rights.”

Throughout Tuesday’s press conference, officials noted the racism ingrained in the software and how the technology easily misidentifies people of color as well as transgender individuals.

Particularly during the coronavirus pandemic, a public health crisis that has disproportionately affected communities of color, it is important not to invest in technology that researchers have proven to be ineffective and that furthers racial inequity, Arroyo said in early May.

The city councilor told reporters on Tuesday that prior to serving on the Boston City Council, he was a public defender for the Massachusetts courts, where misidentification was commonly seen. Such inaccuracies are not new, he said.

“The inaccuracy of cross-racial identification, for instance, has been well litigated,” Arroyo said. “When we talk about facial recognition surveillance, it’s really important to understand that facial recognition technology serves to further racial inequity.”

Massachusetts Institute of Technology researcher Joy Buolamwini, who is expected to speak before the Boston City Council later Tuesday afternoon, discovered “shocking and persistent” racial bias problems in facial recognition algorithms, Crockford said.

Arroyo noted that Buolamwini, the creator of MIT’s Algorithmic Justice League, found in her research that black women were 35 percent more likely than white men to be misclassified by face surveillance.

A December 2019 federal government study also confirmed racial bias remains a major issue for the technology, Crockford said, noting that the software does not impact everyone equally.

“But the technology is equally dangerous when it works exactly as advertised,” Crockford said. “In a free society, we should not be subject to constant government tracking and cataloguing of our every movement, habit and association. For at its logical conclusion, that is exactly the threat face surveillance poses to our individual freedoms and collective freedom.”

As Boston city councilors look to potentially ban facial recognition technology, they are also seeking information about the military equipment the city’s police department has previously acquired and currently uses.

Last week, Wu filed an order requesting a “comprehensive inventory” of all the Boston Police Department’s assets. She called the request part of an effort to “demilitarize” the law enforcement agency.

Boston Mayor Marty Walsh also stated over the weekend he will look at potentially reallocating parts of the police department’s budget to training or “community involvement.”

“How we’ve been funding and operating our public safety infrastructure is not safe for so many residents. Our criminal justice system is not just for black and brown residents in our city and around the country,” Wu said. “In this time of national trauma, we must act with urgency to protect communities and ensure accountability. That begins most concretely and most immediately at the local level.”

©2020 MassLive.com, Springfield, Mass. Distributed by Tribune Content Agency, LLC.