Chicago Tribune (Sunday)

Amazon’s face-detection tech reveals bias

By Tali Arbel

NEW YORK — Facial-detection technology Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto.

Privacy and civil rights advocates called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities. Some Amazon investors have also asked the company to stop, fearing the technology makes Amazon vulnerable to lawsuits.

The researchers said that in their tests, Amazon’s technology labeled darker-skinned women as men 31 percent of the time, while lighter-skinned women were misidentified 7 percent of the time. Darker-skinned men had a 1 percent error rate, and lighter-skinned men had none.

Artificial intelligence can mimic the biases of its human creators. The new study, released Thursday, warns of the potential for abuse and threats to privacy and civil liberties from facial-detection technology.

Matt Wood, general manager of artificial intelligence with Amazon’s cloud-computing unit, said the study uses “facial analysis” and not “facial recognition” technology. Wood said facial analysis “can spot faces in videos or images and assign generic attributes such as wearing glasses; recognition is a different technique by which an individual face is matched to faces in videos and images.”

In a Friday post on the Medium website, MIT Media Lab researcher Joy Buolamwini responded that companies should check all systems that analyze human faces for bias.

“If you sell one system that has been shown to have bias on human faces, it is doubtful your other face-based products are also completely bias free,” she wrote.
