Israel’s AI system used in air strikes criticised over analytical errors
Technology experts have warned Israel’s military of potential “extreme bias error” in relying on Big Data for carrying out strikes in Gaza while using artificial intelligence programmes.
The Israeli military has reportedly used two AI systems, Gospel and Lavender, to carry out air strikes on the enclave.
More than 35,000 people in Gaza have been killed since the start of the war, which began when Hamas launched an attack on southern Israel that killed about 1,200 people.
Big Data, defined as large, diverse sets of information that grow at ever-increasing rates, has become so widespread and powerful with the rise of AI that “in the not too distant future, no one will be able to escape digital surveillance”, Miah Hammond-Errey, director of emerging technology at the University of Sydney, told an online seminar for the RUSI think tank.
Israel’s use of powerful AI systems has allowed its military to enter territory for advanced warfare that had not previously been accessible. The Lavender system is understood to have processed huge amounts of personal data from Gaza, with about 37,000 Palestinian men linked by the system to Hamas or Palestinian Islamic Jihad.
It is also alleged that Israeli strike operators using AI are permitted to kill up to 20 civilians in each attack if the target is deemed to hold a sufficiently senior rank.
Unverified reports say the AI systems had “extreme bias error, both in the targeting data that’s being used, but then also in the kinetic action”, Ms Hammond-Errey told The National.
Such errors can occur when a system is calibrated incorrectly and so produces inaccurate calculations. Ms Hammond-Errey said broad data sets “that are highly personal and commercial” mean that armed forces “don’t actually have the capacity to verify” targets, and that this was potentially “one contributing factor to such large errors”.
She said it would take “a long time for us to really get access to this information”, if ever, “to assess some of the technical realities of the situation”, as the fighting in Gaza continues.
Prof Sir David Omand, former head of Britain’s GCHQ surveillance centre, cautioned against “jumping to conclusions” over Israel’s AI use, as its military had not granted independent access to its systems.
“We just have to be a bit careful before assuming these almost supernatural powers to large data sets on what has been going on in Gaza, and just remember that human beings are setting the rules of engagement,” he said. “If things are going wrong, it’s because human beings have the wrong rules, not because the machines are malfunctioning.” He said Israel’s use of Lavender and Gospel “would likely form a test case for how the international community and tech companies respond to the use of AI”.
Ms Hammond-Errey said that for national security agencies, the “Big Data landscape offers the potential for this invasive targeting and surveillance of individuals”, not only by states but others not governed by rules.
She said Big Data could give armies “military dominance”, as it offers “imperfect global situational awareness but on a scale previously not considered”, especially when connected to space targeting systems.
When aligned with AI, she added, Big Data can compile “comprehensive profiles” of people, institutions, political groups and nation states that “can be made remotely and very quickly”.
Ms Hammond-Errey warned that Big Data had been used around the world to target people and specific groups, exploiting “individual psychological weaknesses” as well as interfering with elections.