The National - News

Israel’s AI system used in air strikes criticised over analytical errors

- THOMAS HARDING

Technology experts have warned Israel’s military of potential “extreme bias error” in relying on Big Data when carrying out strikes in Gaza with artificial intelligence programmes.

The Israeli military has reportedly used two AI systems, Gospel and Lavender, to carry out air strikes on the enclave.

More than 35,000 people in Gaza have been killed since the start of the war, after Hamas launched an attack on southern Israel that killed about 1,200 people.

Big Data, defined as large, diverse sets of information that grow at ever-increasing rates, has now become so widespread and powerful with the rise of AI that “in the not too distant future, no one will be able to escape digital surveillance”, Miah Hammond-Errey, director of emerging technology at the University of Sydney, told an online seminar for the Rusi think tank.

Israel’s use of powerful AI systems has allowed its military to enter territory for advanced warfare that had not previously been accessible. The Lavender system is understood to have processed huge amounts of personal data from Gaza, with about 37,000 Palestinian men linked by the system to Hamas or Palestinian Islamic Jihad.

It is also alleged that Israeli strike operators, using AI, are allowed to kill up to 20 civilians in each attack if the target is deemed to be of an appropriate rank.

Ms Hammond-Errey told The National that, according to unverified reports, the AI systems had “extreme bias error, both in the targeting data that’s being used, but then also in the kinetic action”.

Such errors can occur when a device is calibrated incorrectly, so it miscalculates measurements. Ms Hammond-Errey said broad data sets “that are highly personal and commercial” mean that armed forces “don’t actually have the capacity to verify” targets and that was potentially “one contributing factor to such large errors”.

She said it would take “a long time for us to really get access to this information”, if ever, “to assess some of the technical realities of the situation”, as the fighting in Gaza continues.

Prof Sir David Omand, former head of Britain’s GCHQ surveillance centre, urged against “jumping to conclusions” over Israel’s AI use, as its military had not given independent access to its system.

“We just have to be a bit careful before assuming these almost supernatural powers to large data sets on what has been going on in Gaza, and just remember that human beings are setting the rules of engagement,” he said. “If things are going wrong, it’s because human beings have the wrong rules, not because the machines are malfunctioning.”

Israel’s use of Lavender and Gospel “would likely form a test case for how the international community and tech companies respond to the use of AI”, he added.

Ms Hammond-Errey said that for national security agencies, the “Big Data landscape offers the potential for this invasive targeting and surveillance of individuals”, not only by states but also by others not governed by rules.

Big Data could give armies “military dominance”, as it offers “imperfect global situational awareness but on a scale previously not considered”, especially when connected to space targeting systems.

Aligned with AI, Big Data can compile “comprehensive profiles” of people, institutions, political groups and nation states that “can be made remotely and very quickly”.

Ms Hammond-Errey warned that Big Data had been used around the world to target people and specific groups, exploiting “individual psychological weaknesses” as well as interfering with elections.

Photo: A man sweeps away rubble after a building was damaged by an Israeli strike in Rafah, southern Gaza. AFP
