Los Angeles Times

Facial recognition ID’d lawmakers as crooks, ACLU says

Software used to match pictures against a criminal database was wrong 1 in 5 times.

- By Anita Chabria

SACRAMENTO — California Assemblyman Phil Ting has never been arrested, but he was recently mistaken for a criminal. He’s not surprised. Ting (D-San Francisco), who authored a bill to ban facial recognition software from being used on police body cameras, was one of 26 California legislators who were incorrectly matched with a mug shot in a recent test of a common face-scanning program by the American Civil Liberties Union.

About 1 in 5 legislators was erroneously matched to a person who had been arrested when the ACLU used the software to screen the lawmakers’ pictures against a database of 25,000 publicly available booking photos.

Last year, in a similar experiment done with photos of members of Congress, the software erroneously matched 28 federal legislators with mug shots.

The results, Ting and others said, are proof that facial recognition software is unreliable. They want California law enforcement banned from using it with the cameras officers wear while on duty.

“The software clearly is not ready for use in a law enforcement capacity,” Ting said. “These mistakes, we can kind of chuckle at it, but if you get arrested and it’s on your record, it can be hard to get housing, get a job. It has real impacts.”

Ting’s proposal, Assembly Bill 1215, could soon be on the governor’s desk if it passes the Senate. It is sponsored by the ACLU, and the civil rights organization hopes its recent test will grab attention and persuade legislators to put the technology on hold.

There is little current federal regulation of facial recognition technology. Recently, members on both sides of the aisle in Congress held oversight hearings, and there has been a strong push by privacy advocates for federal action. But concrete measures have yet to materialize.

That has left states and local jurisdictions to grapple with the complex technology on their own.

New Hampshire and Oregon already prohibit facial recognition technology on body-worn cameras, and San Francisco, Oakland and Somerville, Mass., also recently enacted bans for all city departments as well as police.

“I think it’s extremely important for states to be regulating the use of technology by police,” said Barry Friedman, a privacy expert and professor of law at New York University. “It is the Wild, Wild West without a regulatory scheme. Regulation is what we need.”

Friedman serves on an ethics committee for Axon, one of the largest manufacturers of body-worn cameras.

The company has publicly said it will not put facial recognition technology on its cameras because it doesn’t have confidence in its reliability.

Microsoft, which makes a facial recognition product, also recently said it had refused to sell it to a California law enforcement agency. The moves mark an unusual position from corporations seeking boundaries for their products.

“The body camera technology is just very far from being accurate,” Friedman said. “Until the issues regarding accuracy and racial bias are resolved, we shouldn’t be using it.”

But other companies are moving ahead with facial recognition, including Amazon, developer of Rekognition, the software used in the ACLU tests.

Government agencies, including ICE, have also reportedly used the technology, culling through databases of driver’s licenses.

Proponents of the technology contend it could be an important law enforcement tool, especially when policing large events or searching for lost children or elderly people.

The bill is opposed by many law enforcemen­t groups.

Amazon said it could not immediately comment on the most recent ACLU test, but it has previously disputed that the Rekognition software was unreliable, questioning the group’s methods of scanning members of Congress.

In its developer guide, Amazon recommends using a 99% confidence threshold when matching faces, and it has criticized the ACLU for using a lower bar — the factory setting for the software, according to Matt Cagle, an attorney with the Northern California chapter of the ACLU — when testing it.

The Ting proposal would make California the largest state to ban the software, potentially having a ripple effect, Cagle said.

The bill would ban not just facial recognition, but other “biometric surveillance systems” such as those that analyze a person’s gait or log tattoos.

Critics contend that the software is particularly problematic when it comes to identifying women, people of color and young people.

Ting said those disparities were especially troubling to him, since communities of color have historically often been excessively targeted by police, and immigrant communities are feeling threatened by federal crackdowns on illegal immigration.

Police body cameras, he said, have gained popularity in recent years as a police accountability measure in the wake of shootings of black and brown men across the country, including the 2014 death of Michael Brown in Ferguson, Mo., which garnered national attention for the issue.

Transforming body cameras from an accountability measure to a surveillance tool would undermine their purpose, Ting said.

“Body cameras were really deployed to build trust between law enforcement and communities,” Ting said. “Instead of trust, what you are getting is 24/7 surveillance.”
