Daily Press (Sunday)

Too many flaws in commonwealth's facial recognition law

- By Jameson Spivack Guest Columnist Jameson Spivack is an associate at the Center on Privacy & Technology at Georgetown Law. He researches law enforcement's use of algorithmic technologies, with a focus on facial recognition technology.

In Virginia and beyond, there is bipartisan understanding that facial recognition poses significant threats to privacy, civil rights and civil liberties. There is also bipartisan agreement that Virginia's new facial recognition regulation law does not protect against these threats.

Specifically, there are three harms the new law does not address: extensive surveillance, issues of accuracy and bias, and poor performance as an investigative tool.

Surveillan­ce

Facial recognition gives police the power to identify and track people over time without their knowledge. This threat of surveillance has chilling effects on people's ability to exercise their First Amendment-protected rights to free speech and assembly. It may also threaten their right against unreasonable search and seizure, as it allows police to track people's movements over time without a search warrant.

Virginia's new law is insufficient because it gives police overly broad ability to use facial recognition: on a wide range of people (not just suspects but potential witnesses and victims); and in a wide range of circumstances (if they have "reasonable suspicion" to believe an individual has committed a crime). The result is that many people will continue to be tracked by police with no court oversight.

Accuracy and bias

While some of the top-performing facial recognition algorithms have reduced issues with accuracy and bias, facial recognition's reliability remains questionable because of the errors that arise from human implementation.

This law also does not ensure accurate, unbiased algorithms will be used. It sets a minimum accuracy threshold for "true positives" but doesn't take into account "false positives" — falsely identifying someone — which, in the case of policing, could lead to misidentifications and wrongful arrests. Nor does it specify this threshold must be reached on "one to many" matching algorithms, the kind used in law enforcement investigations. Additionally, the law requires that an algorithm show "minimal performance variations across demographics" but provides no further specification, rendering the provision meaningless and effectively allowing police to use a biased algorithm.

Nor does the law address the potential for cognitive biases to impact investigating officers during the search process. It contains insubstantial language about "training" officers, but because facial recognition is so unregulated, there exists little to no accepted training in the first place, calling into question its efficacy as an investigative technique.

Poor investigat­ive tool

There are no comprehensive studies evaluating how reliable facial recognition is in forensic investigations. Given humans' cognitive biases and tendency to defer to recommendations made by a machine, it is likely that, even as an investigative technique, facial recognition suffers from unacknowledged shortcomings. On top of this, officers manipulate photos they run through facial recognition, using photo editing software to change lighting and pixelation, adding facial features from other faces, or using celebrity lookalikes in place of photos of actual suspects. Police say this questionable use should be allowed because facial recognition results only provide a lead for investigators. But there are no recognized standards — from expert bodies, case law or statutes — about what additional evidence is needed beyond the results of a facial recognition search to make an arrest. As a result, the technology has been used to form the sole or primary basis in multiple arrests, some of which were eventually ruled to be misidentifications.

This law does not provide any guidance about additional corroborating evidence needed beyond a facial recognition match. It provides no standards for the quality of photos that are used, which has a significant effect on accuracy. And it does not set any guidelines for photo pre-processing, meaning police are allowed to heavily manipulate the photos they run.

In short, Virginia's new law does not adequately address the risks facial recognition poses, including the proliferation of mass surveillance, discriminatory impacts, and issues of reliability as an investigative tool.
