USA TODAY International Edition

Face-recognition technology has heads spinning in Calif.

- Marco della Cava

SAN FRANCISCO – A routine traffic stop goes dangerously awry when a police officer’s body camera uses its built-in facial recognition software and misidentifies a motorist as a convicted felon.

Guns are drawn. Nerves fray. At best, lawsuits are launched. At worst, tragedy strikes.

That imaginary scenario is what some California lawmakers are trying to avoid by supporting Assembly Bill 1215, the Body Camera Accountability Act, which would ban the use of facial recognition software in police body cams – a national first if it passes a Senate vote this summer and is signed by Gov. Gavin Newsom.

State law enforcemen­t officials here do not now employ the technology to scan those in the line of sight of officers. But some police officials oppose the bill on the grounds that a valuable tool could be lost.

The tug of war over high-tech surveillance methods comes after this tech hub’s Board of Supervisors banned all forms of facial recognition software last month. Oakland and Berkeley council members are considering similar bans.

California’s AB 1215 reflects growing concerns nationwide about the darker side of tech – when the same software that allows iPhone X users to unlock their devices with a glance could wrongfully finger you as a criminal or keep tabs on you for Big Brother.

“There’s been an increased focus on privacy issues generally,” says Pam Greenberg of the National Conference of State Legislatures.

Lawmakers in Massachusetts, New York and Washington are considering bills that scrutinize and curtail the use of biometric and facial recognition systems on grounds that the still-flawed technology presents an Orwellian threat to civil liberties. Congress also is weighing in. After hearings on the technology on May 22 and June 4, the bipartisan U.S. House Oversight and Reform Committee unanimously agreed to push for a nationwide facial recognition ban while more legal and regulatory guidance was sought.

“Even if this tech were to one day work flawlessly, do we want to live in a society where the government knows who you are, where you’re going, the expression on your face?” says Matt Cagle, tech and civil liberties attorney with the ACLU of Northern California.

“Consider also that the history of surveillance is one of it being turned against the most vulnerable communities,” Cagle adds. San Francisco City Supervisor Aaron Peskin, who was instrumental in the city’s ban, says no possible benefit of facial recognition systems “outweighs its demonstrable harms.”

‘Rolling surveillance cameras’

Assembly member Phil Ting, D-San Francisco, sponsor of AB 1215, sees fundamental freedoms being encroached upon if police use facial recognition tech.

“If you turn on facial recognition, you have rolling surveillance cameras,” he says. “And I don’t think anyone in America wants to be watched 24/7.”

What’s more, AB 1215 supporters say facial recognition would undermine the very reason body cams were introduced in the aftermath of police shootings, which is to build trust with community members through accountability.

“Adding facial recognition is a perversion of the purpose of body cams,” says Harlan Yu, executive director of Upturn, a Washington, D.C.-based advocacy group promoting justice in technology. “And it doesn’t help that this software often has a harder time differentiating faces when it comes to people of color.”

While acknowledging that the tech is still in its infancy, some police officials say a wholesale ban is premature.

“Facial recognition could be a valuable tool for us, helping identify felons or even abducted children,” says Detective Lou Turriaga, director of the Los Angeles Police Protective League, which represents 10,000 officers.

“I understand trying to seek a balance between civil liberties and law enforcement, but a wholesale ban doesn’t help us protect anybody,” he says. “Why remove that tool from law enforcement? It just doesn’t make sense.”

Also opposed to AB 1215 as written is the California Police Chiefs Association, although the organization declined to specify why.

Careful use of facial recognition is admirable, but it doesn’t prevent such accumulated data from being hacked by a third party, says the ACLU’s Cagle. “With just a few lines of code and a connection to those cameras, you could potentially turn all that against the community,” he says.

‘Not ready for prime time’

Of greater concern to some privacy watchdogs is what Assembly member Ting calls the “not ready for prime time” nature of facial recognition tech.

While an iPhone X’s camera works well scanning the same face for the same topographical features in order to unlock the smartphone, that’s different from asking the technology to match a real face with a two-dimensional photo.

Last year, the ACLU conducted an experiment in which it used Amazon’s Rekognition software to compare photos of current members of Congress with 25,000 mugshots. The result was that 28 congressional members were falsely flagged as criminals.

There are a growing number of examples that both laud and damn facial recognition software. On the one hand, Google Photos once labeled two African Americans as “gorillas.” On the other, law enforcement used the tech last year to identify the perpetrator in the shooting deaths of five at the Capital Gazette in Maryland, and in India, about 3,000 missing children were identified in four days using the software.

If there is one cautionary tale that surfaces in discussions of this technology, it is China’s use of an array of cameras equipped with facial recognition software to police the Uighurs, a largely Muslim minority in the western part of the country.

“This all is a lot bigger than police body cams. It’s cameras in buildings and on streets, in drones. We’re reaching a critical mass now and haven’t been paying attention,” says Brian Hofer, chairman of the City of Oakland’s privacy commission, which is pushing council members to adopt a ban.

“There’s something visceral about facial recognition, something creepy,” he says. “We have seen the horrors of using the system to target a population, as in China, and yet we have this ridiculous belief surveillance will be used in a friendly manner.”

Many privacy advocates note that Microsoft’s president, Brad Smith, recently cast doubt on the state of facial recognition tech, which his company develops. The Redmond, Washington, company declined to sell its tech to an unnamed California police agency.

“We said, ‘This technology is not your answer,’” Smith said at a Stanford conference on artificial intelligence in April.

And the accuracy of the technology gives many privacy advocates pause. They warn that if actions aren’t taken by lawmakers on a city, state or national level soon, it could be too late.

Feldman says humans have a tendency to romanticize the benevolent and magical powers of technology, at our own peril.

“The great computer in the sky is not foolproof,” she says. “I don’t think it’s possible to put the tech genie back in the bottle.”
