USA TODAY International Edition
Face-recognition technology has heads spinning in Calif.
SAN FRANCISCO – A routine traffic stop goes dangerously awry when a police officer’s body camera uses its built-in facial recognition software and misidentifies a motorist as a convicted felon.
Guns are drawn. Nerves fray. At best, lawsuits are launched. At worst, tragedy strikes.
That imaginary scenario is what some California lawmakers are trying to avoid by supporting Assembly Bill 1215, the Body Camera Accountability Act, which would ban the use of facial recognition software in police body cams – a national first if it passes a Senate vote this summer and is signed by Gov. Gavin Newsom.
State law enforcement officials here do not currently employ the technology to scan those in the line of sight of officers. But some police officials oppose the bill on the grounds that a valuable tool could be lost.
The tug of war over high-tech surveillance methods comes after this tech hub’s City Council banned all forms of facial recognition software last month. Oakland and Berkeley council members are considering similar bans.
California’s AB 1215 reflects growing concerns nationwide about the darker side of tech – when the same software that allows iPhone X users to unlock their devices with a glance could wrongfully finger you as a criminal or keep tabs on you for Big Brother.
“There’s been an increased focus on privacy issues generally,” says Pam Greenberg of the National Conference of State Legislatures.
Lawmakers in Massachusetts, New York and Washington are considering bills that scrutinize and curtail the use of biometric and facial recognition systems on grounds that the still-flawed technology presents an Orwellian threat to civil liberties. Congress also is weighing in. After hearings on the technology on May 22 and June 4, members of the U.S. House Oversight and Reform Committee agreed unanimously, and on a bipartisan basis, to push for a nationwide facial recognition ban while more legal and regulatory guidance was sought.
“Even if this tech were to one day work flawlessly, do we want to live in a society where the government knows who you are, where you’re going, the expression on your face?” says Matt Cagle, tech and civil liberties attorney with the ACLU of Northern California.
“Consider also that the history of surveillance is one of it being turned against the most vulnerable communities,” Cagle adds.
San Francisco City Supervisor Aaron Peskin, who was instrumental in the city’s ban, says no possible benefit of facial recognition systems “outweighs its demonstrable harms.”
‘Rolling surveillance cameras’
Assembly member Phil Ting, D-San Francisco, sponsor of AB 1215, sees fundamental freedoms being encroached upon if police use facial recognition tech.
“If you turn on facial recognition, you have rolling surveillance cameras,” he says. “And I don’t think anyone in America wants to be watched 24/7.”
What’s more, AB 1215 supporters say facial recognition would undermine the very reason body cams were introduced in the aftermath of police shootings, which is to build trust with community members through accountability.
“Adding facial recognition is a perversion of the purpose of body cams,” says Harlan Yu, executive director of Upturn, a Washington, D.C.-based advocacy group promoting justice in technology. “And it doesn’t help that this software often has a harder time differentiating faces when it comes to people of color.”
While acknowledging that the tech is still in its infancy, some police officials say a wholesale ban is premature.
“Facial recognition could be a valuable tool for us, helping identify felons or even abducted children,” says Detective Lou Turriaga, director of the Los Angeles Police Protective League, which represents 10,000 officers.
“I understand trying to seek a balance between civil liberties and law enforcement, but a wholesale ban doesn’t help us protect anybody,” he says. “Why remove that tool from law enforcement? It just doesn’t make sense.”
Also opposed to AB 1215 as written is the California Police Chiefs Association, although the organization declined to specify why.
Careful use of facial recognition is admirable, but it doesn’t prevent such accumulated data from being hacked by a third party, says the ACLU’s Cagle. “With just a few lines of code and a connection to those cameras, you could potentially turn all that against the community,” he says.
‘Not ready for prime time’
Of greater concern to some privacy watchdogs is what Assembly member Ting calls the “not ready for prime time” nature of facial recognition tech.
While an iPhone X’s camera works well scanning the same face for the same topographical features in order to unlock the smartphone, that’s different from asking the technology to match a real face with a two-dimensional photo.
Last year, the ACLU conducted an experiment in which it used Amazon’s Rekognition software to compare photos of current members of Congress with 25,000 mugshots. The result was that 28 congressional members were falsely flagged as criminals.
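The ACLU test was a one-to-many search: each photo is compared against every mugshot, and any similarity score above a confidence threshold counts as a match (the ACLU ran Rekognition at its default 80% setting). Rekognition’s internals are not public, but the failure mode can be sketched with a toy similarity search over face embeddings – every name and vector below is hypothetical, chosen only to show how a lenient threshold manufactures false positives:

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical face embeddings (real systems use hundreds of dimensions).
gallery = {                      # the "mugshot" database
    "mugshot_A": [1.0, 0.0, 0.0],
    "mugshot_B": [0.0, 1.0, 0.0],
}
probes = {                       # photos of people NOT in the gallery
    "member_1": [0.9, 0.4, 0.1],  # coincidentally resembles mugshot_A
    "member_2": [0.1, 0.1, 1.0],  # resembles nothing in the gallery
}

def search(probe, gallery, threshold):
    """Return every gallery entry whose similarity meets the threshold."""
    return [name for name, emb in gallery.items()
            if cosine(probe, emb) >= threshold]

print(search(probes["member_1"], gallery, 0.80))  # ['mugshot_A'] – a false positive
print(search(probes["member_1"], gallery, 0.95))  # [] – stricter threshold, no match
print(search(probes["member_2"], gallery, 0.80))  # [] – dissimilar face is cleared
```

At the loose 0.80 cutoff, an innocent probe is flagged as a felon purely because its embedding happens to sit near a mugshot’s; raising the threshold removes the false hit at the cost of missing genuine matches, which is the accuracy trade-off critics of the technology point to.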
There are a growing number of examples that both laud and damn facial recognition software. On the one hand, Google Photos once labeled two African Americans as “gorillas.” On the other, law enforcement used the tech last year to identify the perpetrator in the shooting deaths of five at the Capital Gazette in Maryland, and in India, about 3,000 missing children were identified in four days using the software.
If there is one cautionary tale that surfaces in discussions of this technology, it is China’s policing of the Uighurs, a largely Muslim minority in the western part of the country, through an array of cameras equipped with facial recognition software.
“This all is a lot bigger than police body cams. It’s cameras in buildings and on streets, in drones. We’re reaching a critical mass now and haven’t been paying attention,” says Brian Hofer, chairman of the City of Oakland’s privacy commission, which is pushing council members to adopt a ban.
“There’s something visceral about facial recognition, something creepy,” he says. “We have seen the horrors of using the system to target a population, as in China, and yet we have this ridiculous belief surveillance will be used in a friendly manner.”
Many privacy advocates note that Microsoft’s president, Brad Smith, recently cast doubt on the state of facial recognition tech, which his company develops. The Redmond, Washington, company declined to sell its tech to an unnamed California police agency.
“We said, ‘This technology is not your answer,’” Smith said at a Stanford conference on artificial intelligence in April.
And the accuracy of the technology gives many privacy advocates pause. They warn that if actions aren’t taken by lawmakers on a city, state or national level soon, it could be too late.
Feldman says humans always have a tendency to romanticize the benevolent and magical powers of technology, at our own peril.
“The great computer in the sky is not foolproof,” she says. “I don’t think it’s possible to put the tech genie back in the bottle.”