Hindustan Times (Amritsar)

WHY WEAPONISED ARTIFICIAL INTELLIGENCE IS A REALITY

- Cori Crider, a US lawyer, investigates the national security state and the ethics of technology in intelligence. She is a former director of the international human rights organisation Reprieve. ©Project Syndicate 2018

Warnings about the risks posed by artificial intelligence (AI) seem to be everywhere. From Elon Musk to Henry Kissinger, people are sounding the alarm that super-smart computers could wipe us out, like in the film “The Terminator.” To hear them talk, you’d think we were on the brink of dystopia – that Skynet is nearly upon us.

These warnings matter, but they gloss over a more urgent problem: weaponised AI is already here. As you read this, powerful interests – from corporations to State agencies, like the military and police – are using AI to monitor people, assess them, and make decisions about their lives. Should we have a treaty ban on autonomous weapons? Absolutely. But we don’t need to take humans “out of the loop” to do damage. Faulty algorithmic processing has been hurting poor and vulnerable communities for years.

I first noticed how data-driven targeting could go wrong five years ago, in Yemen. I was in the capital, Sana’a, interviewing survivors of an American drone attack that had killed innocent people. Two of the civilians who died could have been US allies. One was the village policeman, and the other was an imam who’d preached against al-Qaeda days before the strike. One of the men’s surviving relatives, an engineer called Faisal bin Ali Jaber, came to me with a simple question: Why were his loved ones targeted?

Faisal and I travelled 7,000 miles from the Arabian Peninsula to Washington looking for answers. White House officials met Faisal, but no one would explain why his family got caught in the crosshairs.

In time, the truth became clear. Faisal’s relatives died because they got mistakenly caught up in a semi-automated targeting matrix.

We know this because the US has admitted that its drones attack targets whose identities are unknown. That’s where artificial intelligence comes in. The US doesn’t have deep human intelligence sources in Yemen, so it relies heavily on massive sweeps of signals data. AI processes this data – and throws up red flags in a targeting algorithm. A human fired the missiles, but almost certainly did so on the software’s recommendation.

These kinds of attacks, called “signature strikes,” make up the majority of drone strikes. Meanwhile, civilian airstrike deaths have become more numerous under US President Donald Trump – more than 6,000 last year in Iraq and Syria alone. This is AI at its most controversial. And the controversy spilled over to Google this spring, with thousands of employees protesting – and some resigning – over a bid to help the Defence Department analyse drone feeds. But this isn’t the only potential abuse of AI we need to consider.

Journalists have started exploring many problematic uses of AI: predictive policing heatmaps have amplified racial bias in our criminal justice system. Facial recognition, which the police are currently testing in cities such as London, has been wrong as much as 98% of the time. Shop online? You may be paying more than your neighbour because of discriminatory pricing. And we’ve all heard how state actors have exploited Facebook’s News Feed to put propaganda on the screens of millions.

Academics sometimes say that the field of AI and machine learning is in its adolescence. If that’s the case, it’s an adolescent we’ve given the power to influence our news, to hire and fire people, and even kill them.

For human rights advocates and concerned citizens, investigating and controlling these uses of AI is one of the most urgent issues we face. Every time we hear of a data-driven policy decision, we should ask ourselves: Who is using the software? Who are they targeting? Who stands to gain, and who stands to lose? And how do we hold the people who use these tools, as well as the people who built them, to account?
