The Guardian Australia

A dystopian robo-dog now patrols New York City. That's the last thing we need

- Akin Olla

The New York police department has acquired a robotic police dog, known as Digidog, and has deployed it on the streets of Brooklyn, Queens and, most recently, the Bronx. At a time when activists in New York, and beyond, are calling for the defunding of police departments – for the sake of funding more vital services that address the root causes of crime and poverty – the NYPD's decision to pour money into a robot dog seems tone-deaf if not an outright provocation.

As Congresswoman Alexandria Ocasio-Cortez, who represents parts of Queens and the Bronx, put it on Twitter: "Shout out to everyone who fought against community advocates who demanded these resources go to investments like school counseling instead. Now robotic surveillance ground drones are being deployed for testing on low-income communities of color with underresourced schools."

There is more than enough evidence that law enforcement is lethally racially biased, and adding an intimidating non-human layer to it seems cruel. And, as we've seen with artificial intelligence domestically and autonomous drone warfare abroad, it is clear that already dehumanized Black and Muslim residents will be the ones to face the brunt of the damage of this dystopian development, particularly in a city with a history of both anti-Black racism and Islamophobia.

Law enforcement in the United States is already biased and grounded in a history of systemic racism. Many police departments in the US evolved from slave-catching units or union-busting militias, and their use today to disproportionately capture and imprison Black people drips of those origins. And it isn't just the institutions themselves that perpetuate racism; individual police officers are also biased and more likely to view Black people as threats. Even Black police officers share these biases and often replicate the harm of their white counterparts. On top of that, the NYPD in particular has a history of targeting its Arab and Muslim population, even going as far as to use undercover agents to spy on Muslim student associations in surrounding states. Any new technological development will only give police departments new tools to further surveil, and potentially to arrest or kill, Black and Muslim people.

By removing the human factor, artificial intelligence may appear to be an "equalizer" in the same vein as more diverse police departments. But AI shares the biases of our society. Coded Bias, a 2020 documentary, followed the journey of Joy Buolamwini, a PhD candidate at MIT, as she set out to expose the inability of facial recognition software to distinguish dark-skinned women from one another. While many tech companies have now ceased providing this software to police departments due to the dangers it may pose, police departments themselves have doubled down on the use of other forms of AI-driven law enforcement.

Police already use location-based AI to determine when and where crime may occur, and individual-based AI to identify people deemed to have an increased probability of committing crime. While these tools are considered a more objective way of policing, they are dependent on data from biased police departments, courts and prisons. For example, Black people are more likely to be arrested for drug-related crimes, and thus appear more likely to commit crime, despite being less likely to sell drugs in the first place.

The use of human operators will do little to offset the biases of AI programming. Remote-controlled drones create a layer of dehumanization that is already present in police interactions. Drone operators have complained of the trauma that has come from seeing other human beings as little more than pixels on a screen. In February 2020, a US air force drone operator compared the US military to Nazi Germany after allegedly being asked to kill an Afghan child that his overseers insisted was a dog. Speaking to ABC's Eyewitness News, an operator of the NYPD's robot dog troublingly described the process of operating the urban drone as "as simple as playing a video game".

While Boston Dynamics, the creators of the robot dog, have insisted that Digidog will never be used as a weapon, it is highly unlikely that that will remain true. MSCHF, a political art collective, has already shown how easy it is to weaponize the dog. In February they mounted a paintball gun on its back and used it to fire upon a series of art pieces in a gallery. The future of weaponized robot policing has already been paved by the Dallas police department. In 2016, the DPD used a robot armed with a bomb to kill Micah Johnson, an army reservist who served in Afghanistan, after he killed five police officers in what he said was retaliation for the deaths of Black people at the hands of law enforcement. While it was clear that he posed a threat to police, it is very fitting that a Black man would be the first person to be killed by an armed robot in the United States – roughly a year after the white mass shooter Dylann Roof was met with a free burger and police protection.

A small handful of Muslim Americans have also been killed by drones, though in other countries. The most glaring case was that of Abdulrahman al-Awlaki, a 16-year-old US citizen. Abdulrahman was the son of an alleged al-Qaida strategist, Anwar al-Awlaki. Both were killed in separate drone strikes, despite never being charged with crimes, let alone given any form of trial. While it is easy to condemn Anwar al-Awlaki, there has been no evidence provided whatsoever that justified the killing of Abdulrahman. When President Obama's White House press secretary was questioned about the killing, he simply implied that the boy's father should have chosen a different occupation.

Abdulrahman was an innocent teenage boy whose death should have caused a nationwide uproar; aside from groups like the ACLU, however, his death went relatively unnoticed and unopposed. It seems doubtful that Americans would have so callously ignored the death of a white teenager in a drone bombing. And it is equally doubtful that a police department with a history of Islamophobia would hesitate to use robot dogs and aerial drones to expand its targeting of Muslim and Arab people.

The United Nations has called for a ban on autonomous weapons, and not long ago many countries around the world sought to ban armed drones. But the United States unfortunately continues to set the precedent for drone and autonomous warfare, driving other countries to follow suit in competition. We can't allow our government to replicate this dynamic inside our borders as well, through the domestic use of drones and robotic police.

This is a time for the US to scale back its wars, internal and external, but instead, the NYPD, which many people – including former mayor Michael Bloomberg – consider an army, has chosen to lead the way in dystopian enforcement.

Akin Olla is a Nigerian-American political strategist and organizer. He works as a trainer for Momentum Community and is the host of This is The Revolution podcast


Photograph: Boston Dynamics/Reuters. 'There is more than enough evidence that law enforcement is lethally racially biased, and adding an intimidating non-human layer to it seems cruel.'
