Sun Sentinel Broward Edition

Some frown on facial recognition

But police press citizens, tech companies to implement an improving technology

By Matt O’Brien

[Photo caption: A video surveillance camera hanging on a pole outside City]

SPRINGFIELD, Mass. — Police departments around the country are asking citizens to trust them to use facial recognition software as another handy tool in their crime-fighting toolbox.

But some lawmakers — and even some technology giants — are hitting the brakes.

Are fears of an all-seeing, artificially intelligent security apparatus overblown? Not if you look at China, where advancements in computer vision applied to vast networks of street cameras have enabled authorities to track members of ethnic minority groups for signs of subversive behavior.

American police officials and their video surveillance industry partners contend that won’t happen here. They are pushing back against a movement by cities, states and federal legislators to ban or curtail the technology’s use. And the efforts aren’t confined to typical bastions of liberal activism that enacted bans this year: San Francisco, Oakland, Berkeley and the Boston suburbs of Somerville and Brookline.

Take the western Massachusetts city of Springfield, a former manufacturing hub where a majority of the 155,000 residents are Latino or black, and where police brutality and misconduct lawsuits have cost the city millions of dollars. Springfield police say they have no plans to deploy facial recognition systems, but some city councilors are moving to block any future government use of the technology anyway.

At an October hearing on the subject, Springfield City Councilor Orlando Ramos said he doesn’t want to take any chances. “It would only lead to more racial discrimination and racial profiling,” he said, citing studies that found higher error rates for facial recognition software used to identify women and people with darker skin tones.

“I’m a black woman and I’m dark,” another Springfield councilor, Tracye Whitfield, told the city’s police commissioner, Cheryl Clapprood, who is white. “I cannot approve something that’s going to target me more than it will target you.”

Clapprood defended the technology and asked the council to trust her to pursue it carefully. “The facial recognition technology does not come along and drop a net from the sky and carry you off to prison,” she said, noting that it could serve as a useful investigative tool by flagging wanted suspects.

The council hasn’t yet acted, and the Springfield mayor has threatened to veto the proposal that Ramos plans to reintroduce in January.

Similar debates across the country are highlighting racial concerns and dueling interpretations of the technology’s accuracy.

“I wish our leadership would look at the science and not at the hysteria,” said Lancaster, California, Mayor R. Rex Parris, whose city north of Los Angeles is working to install more than 10,000 streetlight cameras Parris says could monitor known pedophiles and gang members. “There are ways to build in safeguards.”

Research suggests that facial recognition systems can be accurate, at least under ideal conditions. A review of the industry’s leading facial recognition algorithms by the National Institute of Standards and Technology found they were more than 99% accurate when matching high-quality head shots to a database of other frontal poses.

But trying to identify a face from a video feed — a potentially useful technique for detectives — can cause accuracy rates to plunge. NIST found that recognition accuracy could fall below 10% when using ceiling-mounted cameras commonly found in stores and government buildings.

The agency hasn’t studied the performance of facial recognition on body camera footage, although experts generally believe that its often-jumpy video will render the technique even less reliable.

In October, California Gov. Gavin Newsom signed a temporary ban on police departments using facial recognition with body cameras. Some other states have similar restrictions.

While California’s three-year moratorium was opposed by law enforcement groups, companies that provide video-surveillance equipment have mostly reacted with shrugs. Many businesses were already moving carefully before subjecting themselves to the legal, ethical and publicity risks of a technology that is facing backlash from privacy, civil liberties and racial justice advocates, not to mention bipartisan concern in Congress.

Axon, which supplies body-worn cameras to most of California’s big cities and is the biggest provider nationwide, had already formed an AI ethics board of outside experts that concluded facial recognition technology isn’t yet reliable enough to justify its use on police cameras. False identification could lead someone to be hurt or killed, said Axon CEO Rick Smith.

Even if facial recognition software were perfectly accurate, Smith said in an interview, the ability to track people’s whereabouts raises constitutional and privacy concerns. “Do we want everybody who walks near a police officer to get their face identified and logged in a database?” he said.

Microsoft last year turned down an unnamed California police agency’s request to equip all police cars and body cameras with Microsoft’s facial recognition software, the company’s president and chief legal officer Brad Smith wrote in a new book on tech policy. He said police wanted to match a photo of anyone pulled over, even routinely, against a database of suspects for other crimes. Microsoft in November hired an attorney to speak out against a proposed ban in Portland, Maine.

Other companies including Amazon, which markets a face identification system called Rekognition to law enforcement, have shown fewer qualms about selling their technology to police. Some law enforcement agencies feed images from video surveillance into software that can search government databases or social media for a possible match.

Todd Pastorini, general manager at biometric forensics company DataWorks Plus, said it’s important to distinguish between real-time crowd surveillance — which is rare in the U.S. — and the “extremely effective” method of running images through a pool of known police mug shots or driver’s license photos to help identify a suspect.

“Society and the public are going to get frustrated” if governments block law enforcement from adopting a technology that keeps improving, he said.

Among his South Carolina company’s biggest face-matching clients is New York City, which first adopted facial recognitio­n in 2011 and also uses software from French company Idemia.

“I’d absolutely be opposed to a ban,” New York City Police Commissioner James O’Neill told reporters this fall.

O’Neill, who retired in early December, added that facial recognition hits are just one part of an investigation. “There is so much video in New York City today that to not use facial recognition would be irresponsible,” he said.

MATT O’BRIEN/AP
