Arkansas Democrat-Gazette

Facial bias?

By Tali Arbel

A U.S. government study has backed up concerns privacy advocates and lawmakers have been raising about race and gender bias in facial recognition software.

Face-scanning applications are increasingly used by law enforcement and businesses. The concern is that using the software could deepen racial discrimination in the criminal justice system and other institutions. Some cities, states and federal lawmakers want to ban or curb the technology's use. Research suggests that facial recognition systems can be accurate, but they are much less effective at identifying a face from a video feed.

The report shows higher error rates for women, certain racial groups, and the youngest and oldest people. However, it also cautioned that previous research about bias in facial recognition had been "incomplete."

The National Institute of Standards and Technology, part of the Commerce Department, tested the algorithms of 99 mostly commercial software providers that voluntarily submitted their technology for review. They included Microsoft and companies based in China. Amazon, which makes face-scanning software for U.S. police agencies, didn't participate, as Amazon's cloud-based software doesn't work with NIST's testing procedures.

The American Civil Liberties Union says government agencies should stop using face-scanning software because of the report's conclusions.