USA TODAY US Edition

Researchers call facial recognition ‘imperfect’

Survey: Technology brings mixed feelings

- Dalvin Brown

When it comes to public opinion, tech giants such as Facebook, Apple and Amazon aren’t seen as trustworthy with facial recognition software. On the other hand, most Americans believe that government agencies and law enforcement officials use the identification tools responsibly, a study finds.

The Pew Research Center released a report Thursday that sought to examine attitudes among U.S. adults toward facial recognition.

The survey suggests that even as artificial-intelligence-powered cameras are raising questions about fairness and accuracy, more than half of Americans (56%) believe law enforcement agencies will put the tech to good use.

By contrast, about a third of adults think tech companies will use the tools responsibly. Half of U.S. adults say they either don’t trust “too much” how companies use facial recognition or they don’t trust how they implement the tools “at all.”

Widespread adoption of the controversial software is a relatively new phenomenon, with the technology cropping up in airports and in city surveillance cameras just this year. Still, most Americans are at least somewhat familiar with the concept.

In fact, 86% have heard of facial recognition, and awareness spans almost all demographics. There are, however, stark contrasts in how ethnic groups embrace the technology.

A larger share of whites (64%) believes it’s acceptable for law enforcement to use facial recognition in public spaces. That sentiment was shared by fewer than half of blacks (47%) and 55% of Hispanics.

Accuracy

One of the most talked-about criticisms of facial recognition is that it can lead to racial profiling.

In 2018, an MIT researcher uncovered that some facial recognition software could accurately identify a white man but fail at identifying a person with darker skin. That year, Amazon’s controversial facial recognition program, Rekognition, falsely identified 28 members of Congress during a test by the American Civil Liberties Union.

Amazon said its technology can be used for a long list of beneficial purposes “from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking.”

According to Pew Research data scientists Stefan Wojcik and Emma Remy, “These systems can fail in ways that seem difficult to understand and hard to predict – such as showing higher rates of error on the faces of people with darker skin relative to those with lighter skin, or classifying prominent members of Congress as criminals.”

After conducting several tests to find out if facial recognition could accurately identify genders based on outward appearances, the researchers concluded that deep learning systems trained to identify humans can be “imperfect,” which “is to be expected.”

“What may be less obvious is that they can be significantly less reliable for some groups than others – and that these differences may not necessarily be driven by intuitive or obvious factors,” researchers wrote in the report.

“People who rely on the decisions these systems make should approach the results they produce with the knowledge that they may be hiding problems or biases.”

GETTY IMAGES: Some are skeptical of the use of facial recognition technology.
