Facial recognition bias?
A U.S. government study has backed up concerns privacy advocates and lawmakers have been raising about race and gender bias in facial recognition software.
Face-scanning applications are increasingly used by law enforcement and businesses. The concern is that using the software could deepen racial discrimination in the criminal justice system and other institutions. Some cities, states and federal lawmakers want to ban or curb the technology’s use. Research suggests that facial recognition systems can be accurate, but they are much less effective at identifying a face from a video feed.
The report found higher error rates for women, certain racial groups, and the youngest and oldest people. However, it also cautioned against relying on previous research into facial recognition bias, which it described as “incomplete.”
The National Institute of Standards and Technology, part of the Commerce Department, tested algorithms from 99 mostly commercial software providers that voluntarily submitted their technology for review. Participants included Microsoft and several companies based in China. Amazon, which makes face-scanning software for U.S. police agencies, didn’t participate; its cloud-based software doesn’t work with NIST’s testing procedures.
The American Civil Liberties Union says government agencies should stop using face-scanning software because of the report’s conclusions.