Sun Sentinel Broward Edition

The good and bad sides of facial recognition technology

By Clare Garvie | Special to The Washington Post
Clare Garvie is an associate with Georgetown Law’s Center on Privacy & Technology.

When deployed as a tool to unlock your phone, facial recognition may be a convenience. When used by a company to tag you in photos, the technology may raise questions of privacy, consent and data security. But when deployed as a surveillance tool, facial recognition upends some of our most basic assumptions about how the police interact with the public.

“If we move too fast with facial recognition, we may find that people’s fundamental rights are being broken,” Microsoft President Brad Smith wrote in a blog post last week, calling for transparency, regulation and corporate responsibility with this technology.

He might actually be understating the issue.

Imagine attending a public gathering — a political rally, an immigration-policy protest or an anti-abortion march — and police officers walk through the crowds demanding each attendee show identification. You would be justified both in your outrage at this intrusion and in refusing to comply. In this country, a police officer must suspect you of committing a crime before stopping you on the street and requiring an answer to the question: “Who are you?”

Face-scanning surveillance does away with this. The technology enables a world where every man, woman and child passing by a camera is scanned, despite no prior suspicion of wrongdoing, and their faces are nonetheless compared against the profiles of criminals and other people wanted by the police. It enables a world where people can be identified and tracked from camera to camera throughout a city — simply because they chose to get a driver’s license.

In China, face-scanning surveillance is deployed by the government to do exactly this. Cameras scan and check the faces of passersby against a national database of names, ages and ethnicities. The system can inform authorities about everywhere you have been over the past few days, and everyone you may have met.

That’s China. But it is not idle speculation to think about what a future with this technology might look like in the United States. Amazon, together with the Orlando Police Department, is already piloting a face-scanning surveillance program using live video cameras. Axon, formerly known as Taser and the largest current supplier of body cameras to law-enforcement agencies in the country, recently filed a patent to incorporate face-scanning surveillance into its hardware. Most major companies that sell other facial-recognition systems to law enforcement advertise tools for conducting face-scanning surveillance, as well.

And what happens if a system like this gets it wrong? A mistake by a video-based surveillance system may mean an innocent person is followed, investigated, and maybe even arrested and charged for a crime he or she didn’t commit. A mistake by a face-scanning surveillance system on a body camera could be lethal. An officer, alerted to a potential threat to public safety or to himself, must, in an instant, decide whether to draw his weapon. A false alert places an innocent person in those crosshairs.

Facial-recognition technology advances by the day, but problems with accuracy and misidentifications persist, especially when the systems must contend with poor-quality images — such as those from surveillance cameras.

South Wales police officials have tested face-scanning surveillance at more than a dozen public events. During most of these, the number of false “matches” the system flagged — innocent attendees mistaken for persons of interest — far exceeded the number of suspects identified. At one test, more than 9 of every 10 alerts the system sent police about a possible criminal match — of almost 2,500 alerts in total — were triggered by an innocent person’s face.

There are circumstances in which face-scanning surveillance may be necessary. Public emergencies unfortunately do occur, during which officers must do what is within their power to find someone posing a threat to others. But this step should only be taken in true emergencies, where the cost of treating every person as a suspect is clearly outweighed by the emergency at hand.

We have the right to an expectation of privacy. We have the right not to be investigated unless we’re suspected of wrongdoing. We should be able to expect that the tools used by law enforcement will not mistakenly identify us as criminal suspects. Face-scanning surveillance risks upending these expectations, so let’s hope legislators are listening to the growing chorus in favor of regulating the technology before it fundamentally changes the role of police in our society.

