Pittsburgh Post-Gazette

Ban facial recognition until legal safeguards are in place


Because very few laws or public policies guide the use of facial recognition software, mass surveillance using this technology endangers the right of privacy in general, and racial justice in particular. Pittsburgh should ban the use of facial recognition by law enforcement, at least until federal law eliminates the privacy risks and the accuracy of the systems can be assured across racial lines.

The most famous facial recognition company, Clearview AI, has a database of several billion images of people scraped from public records, social media accounts and so on. Over 3,000 American law enforcement agencies at all levels use Clearview’s data and software.

We can’t avoid being on camera and having our images loaded into these databases. The proliferation of private cameras in commercial districts and residential neighborhoods supplies them with even more images. And according to a University of Nevada study, over 1,000 police departments deploy drones to monitor their communities. If you leave your home, you will be on film, and can be identified.

This seems clearly to violate our right to privacy, as Americans understand it. People should be able to go to the grocery store, park, or library, or to meet with friends, without being recorded.

But there are few rules about how these emerging technologies can and can’t be used. Case law interpreting the limitations placed on facial recognition by the Fourth Amendment is minimal. No federal privacy law defines what private companies can collect, and what they can do with it. That’s why Clearview considers the applications of its technology across the public and private sectors to be “limitless.”

If you think it will only be deployed against “bad guys” who don’t look like you, you are mistaken. For example, nothing now stops private firms and public agencies from identifying participants in political events. Imagine the Justice Department compiling a list of people seen at Doug Mastriano rallies as “extremist” security threats, or a Gov. Mastriano directing the state police to do the same with people protesting his administration. The possibilities for abuse are, as Clearview so giddily indicates, “limitless.”

But even if this technology were deployed only for good and decent purposes, it still has a racial bias problem. According to a Brookings Institution report, a 2018 test of commercial algorithms showed a tiny error rate when identifying white men — but an error rate of over 20% when applied to women with darker skin. While the software has improved, Clearview and others are still plagued by lower accuracy when identifying people of color.

In other words, facial recognition technology makes it more likely that people of color will be falsely identified, arrested and convicted for crimes they did not commit. Being falsely identified, even if later cleared, is a damaging and traumatic experience.

Again, no federal or state law deals with this. That’s why Pittsburgh should apply limits of its own. Until appropriate safeguards are in place, and racial biases corrected — and maybe beyond — facial recognition should not be allowed in Pittsburgh.
