Los Angeles Times

Curb face recognition searches

Law enforcement agencies haven’t really considered the consequences to our constitutional rights.

- By Clare Garvie and Neema Singh Guliani

When is it appropriate for police to conduct a face recognition search? To figure out who’s who in a crowd of protesters? To monitor foot traffic in a high-crime neighborhood? To confirm the identity of a suspect — or a witness — caught on tape?

According to a new report by Georgetown Law’s Center on Privacy & Technology, these are questions very few police departments asked before widely deploying face recognition systems. And this “use first, worry about the consequences later” approach is undermining Americans’ right to privacy, free speech and assembly.

Consider some known uses of face recognition technology. In April, the Baltimore Police Department used it to locate, identify and arrest certain people protesting Freddie Gray’s death in police custody. In Los Angeles three years ago, the Los Angeles Police Department deployed to undisclosed locations 16 wireless video cameras that can conduct real-time face recognition.

Until appropriate legal protections are adopted, communities should issue a moratorium on most uses of face recognition.

Today police can use almost any facial photograph — one snapped during a police stop, or copied from social media — and comb through massive databases of photos for an identification. In 26 states, police can submit searches against databases containing all driver’s license photos from that state. In several other jurisdictions, police already have, or are trying to obtain, video systems that scan faces and check them in real time for matches against a set of photos.

We don’t know exactly how widespread these systems already are — but it’s certainly more extensive than most people think. According to the Georgetown Law report, at least one in four U.S. law enforcement agencies has the ability to run face recognition searches. Photos of nearly half of all American adults are already included in law enforcement face recognition networks.

In San Diego County, for example, more than 800 officers from 28 local law enforcement agencies run an average of 560 face recognition searches each month. In Los Angeles County, face recognition systems are accessible by all local law enforcement agencies, including school police. In Florida, Maryland, Pennsylvania, Ohio and elsewhere — 26 states in total — police or the FBI can submit face recognition searches against that state’s driver’s license photos.

Face recognition technology fundamentally changes how law enforcement interacts with the public. It allows police to surreptitiously identify you from a distance without requiring consent or even the suspicion of wrongdoing. It lets police document not just what happens at a protest or a rally, but who is there. It can facilitate police tracking of your whereabouts in real time. Police generally cannot track your location without a court order — yet in many jurisdictions, there are no restrictions on police accomplishing these same ends using remote cameras and face recognition.

Not surprisingly, most police departments implemented this technology with little or no public discussion and few safeguards. Of the more than 50 law enforcement agencies surveyed about their use of face recognition technology in the Georgetown Law report, just four had issued public policies about its use. Of those, only San Diego had subjected its policy to review and approval by elected legislators.

This lack of transparency and avoidance of democratic oversight by police departments is already taking a toll. Now, instead of having a public debate on how face recognition technology can be used without infringing on civil liberties, we must consider how to rein in what are, in many cases, out-of-control surveillance systems.

One particular cause for alarm is evidence that this technology disproportionally affects people of color. A prominent 2012 study co-written by an FBI expert found that several leading face recognition algorithms were up to 10% less accurate on photos of African Americans. Combined with the overrepresentation of people of color in face recognition databases to begin with, that flaw could easily exacerbate the biases in policing that have given rise to protests around the country.

It’s time to press the pause button on the uncontrolled use of this technology by police so that state and local lawmakers can weigh whether face recognition technology is a net public good.

If they do choose to allow such searches, legislatures should pass comprehensive laws on face recognition. At a minimum, such laws should limit the technology’s use to protect all people’s constitutional rights. They should require individualized suspicion that someone has committed a crime. There should also be regular audits to guard against misuse, and public disclosure of the frequency and types of searches run.

Without these protections, face recognition technology imperils our rights to privacy, free speech and assembly — and those risks may not be borne equally. In a time when people are taking to the streets to voice opposition to policing practices, uncontrolled use of face recognition technology adds fuel to the fire.

Clare Garvie is an associate with Georgetown Law’s Center on Privacy and Technology.

Neema Singh Guliani is a legislative counsel with the ACLU.
