The Guardian (USA)

Amazon to ban police use of facial recognition software for a year

- Kari Paul

Amazon is implementing a one-year moratorium on police use of its artificial intelligence software Rekognition amid a growing backlash over the tech company’s ties to law enforcement.

The company has recently stated its support for the Black Lives Matter movement, which advocates for police reform – using Twitter to call for an end to “the inequitable and brutal treatment of black people” in the US and putting a “Black lives matter” banner at the top of its home page. But the company has been criticized as hypocritical because it sells its facial recognition software to police forces.

Amazon has not said how many police forces use the technology, or how it is used, but marketing materials have promoted the use of Rekognition in conjunction with police body cameras in real time.

When it was first released, Amazon’s Rekognition software was criticized by human rights groups as “a powerful surveillance system” that is available to “violate rights and target communities of color”. Advocacy groups also said the technology could have a disproportionately negative effect on non-white people. Congresswoman Alexandria Ocasio-Cortez echoed this complaint in a tweet on Wednesday, saying the technology “shouldn’t be anywhere near law enforcement”.

“Facial recognition is a horrifying, inaccurate tool that fuels racial profiling and mass surveillance,” she said. “It regularly falsely [identifies] Black and Brown people as criminal”.

An experiment run by the ACLU in 2018 showed Rekognition incorrectly matched 28 members of Congress to photos of people arrested for a crime. It overwhelmingly misidentified Congress members who are not white. Facial recognition software, like many forms of artificial intelligence, has a long history of racial bias. The field of artificial intelligence, which is overwhelmingly white and male, is frequently criticized for its lack of diversity.

In a statement on its blog Wednesday, Amazon said it will pull the use of its technology from police forces until there is stronger regulation around it. The move follows IBM putting a permanent end to its development of facial recognition technology.

“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” Amazon said. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”

While some privacy advocates say the move represents a step in the right direction, Evan Greer, of digital rights group Fight for the Future, said this is “nothing more than a public relations stunt from Amazon”.

She said Amazon could spend the year of the moratorium improving the technology and lobbying Congress for industry-friendly regulation so the technology can be implemented in the future. Amazon spent $16.8m on lobbying in 2019.

“The reality is that facial recognition technology is too dangerous to be used at all,” Greer said. “Like nuclear or biological weapons, it poses such a profound threat to the future of humanity that it should be banned outright.”

Nicole Ozer, the technology and civil liberties director with the American Civil Liberties Union of Northern California, also called on Amazon to make more meaningful commitments. “This surveillance technology’s threat to our civil rights and civil liberties will not disappear in a year,” Ozer said. “Amazon must fully commit to a blanket moratorium on law enforcement use of face recognition until the dangers can be fully addressed, and it must press Congress and legislatures across the country to do the same. They should also commit to stop selling surveillance systems like Ring that fuel the over-policing of communities of color.”

The Washington County sheriff’s office in Oregon, the first law enforcement agency in the country to contract with Amazon to use the technology, confirmed on Wednesday it would suspend its use of the product in light of the announcement.

Suspension of this particular program does not mean all partnerships with law enforcement will be halted. Amazon noted in its announcement that the International Center for Missing and Exploited Children, as well as technology companies Thorn and Marinus Analytics, will still have access to Rekognition for human trafficking cases.

Amazon also has not made changes to Ring, its camera-connected smart doorbell company, which has also been criticized for increasing the policing of non-white Americans. A report from Motherboard in 2019 revealed black and brown people are more likely to be surveilled by the Neighbors app, where Ring users can post videos and photos of “suspicious” people caught on camera.

The doorbell app now partners with more than 1,300 police forces across the US – more than triple the roughly 400 police forces it partnered with in August 2019. The ACLU has called on Amazon to “stop selling surveillance systems like Ring that fuel the over-policing of communities of color”. It also called on other companies that power facial recognition, including Microsoft, to halt the technology.

“Face recognition technology gives governments the unprecedented power to spy on us wherever we go,” said Ozer. “It fuels police abuse. This surveillance technology must be stopped.”

Amazon’s Staten Island warehouse. The company has announced a moratorium on police use of its artificial intelligence software. Photograph: Angela Weiss/AFP/Getty Images
