The Guardian (USA)

‘The Gospel’: how Israel uses AI to select bombing targets in Gaza

- Harry Davies, Bethan McKernan and Dan Sabbagh in Jerusalem

Israel’s military has made no secret of the intensity of its bombardment of the Gaza Strip. In the early days of the offensive, the head of its air force spoke of relentless, “around the clock” airstrikes. His forces, he said, were only striking military targets, but he added: “We are not being surgical.”

There has, however, been relatively little attention paid to the methods used by the Israel Defense Forces (IDF) to select targets in Gaza, and to the role artificial intelligence has played in their bombing campaign.

As Israel resumes its offensive after a seven-day ceasefire, there are mounting concerns about the IDF’s targeting approach in a war against Hamas that, according to the health ministry in Hamas-run Gaza, has so far killed more than 15,000 people in the territory.

The IDF has long burnished its reputation for technical prowess and has previously made bold but unverifiable claims about harnessing new technology. After the 11-day war in Gaza in May 2021, officials said Israel had fought its “first AI war” using machine learning and advanced computing.

The latest Israel-Hamas war has provided an unprecedented opportunity for the IDF to use such tools in a much wider theatre of operations and, in particular, to deploy an AI target-creation platform called “the Gospel”, which has significantly accelerated a lethal production line of targets that officials have compared to a “factory”.

The Guardian can reveal new details about the Gospel and its central role in Israel’s war in Gaza, using interviews with intelligence sources and little-noticed statements made by the IDF and retired officials.

This article also draws on testimonies published by the Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call, which have interviewed several current and former sources in Israel’s intelligence community who have knowledge of the Gospel platform.

Their comments offer a glimpse inside a secretive, AI-facilitated military intelligence unit that is playing a significant role in Israel’s response to the Hamas massacre in southern Israel on 7 October.

The slowly emerging picture of how Israel’s military is harnessing AI comes against a backdrop of growing concerns about the risks posed to civilians as advanced militaries around the world expand the use of complex and opaque automated systems on the battlefield.

“Other states are going to be watching and learning,” said a former White House security official familiar with the US military’s use of autonomous systems.

The Israel-Hamas war, they said, would be an “important moment if the IDF is using AI in a significant way to make targeting choices with life-and-death consequences”.

From 50 targets a year to 100 a day

In early November, the IDF said “more than 12,000” targets in Gaza had been identified by its target administration division.

Describing the unit’s targeting process, an official said: “We work without compromise in defining who and what the enemy is. The operatives of Hamas are not immune – no matter where they hide.”

The activities of the division, formed in 2019 in the IDF’s intelligence directorate, are classified.

However, a short statement on the IDF website claimed it was using an AI-based system called Habsora (the Gospel, in English) in the war against Hamas to “produce targets at a fast pace”.

The IDF said that “through the rapid and automatic extraction of intelligence”, the Gospel produced targeting recommendations for its researchers “with the goal of a complete match between the recommendation of the machine and the identification carried out by a person”.

Multiple sources familiar with the IDF’s targeting processes confirmed the existence of the Gospel to +972/Local Call, saying it had been used to produce automated recommendations for attacking targets, such as the private homes of individuals suspected of being Hamas or Islamic Jihad operatives.

In recent years, the target division has helped the IDF build a database of what sources said was between 30,000 and 40,000 suspected militants. Systems such as the Gospel, they said, had played a critical role in building lists of individuals authorised to be assassinated.

Aviv Kochavi, who served as the head of the IDF until January, has said the target division is “powered by AI capabilities” and includes hundreds of officers and soldiers.

In an interview published before the war, he said it was “a machine that produces vast amounts of data more effectively than any human, and translates it into targets for attack”.

According to Kochavi, “once this machine was activated” in Israel’s 11-day war with Hamas in May 2021 it generated 100 targets a day. “To put that into perspective, in the past we would produce 50 targets in Gaza per year. Now, this machine produces 100 targets a single day, with 50% of them being attacked.”

Precisely what forms of data are ingested into the Gospel is not known. But experts said AI-based decision support systems for targeting would typically analyse large sets of information from a range of sources, such as drone footage, intercepted communications, surveillance data and information drawn from monitoring the movements and behaviour patterns of individuals and large groups.

The target division was created to address a chronic problem for the IDF: in earlier operations in Gaza, the air force repeatedly ran out of targets to strike. Since senior Hamas officials disappeared into tunnels at the start of any new offensive, sources said, systems such as the Gospel allowed the IDF to locate and attack a much larger pool of more junior operatives.

One official, who worked on targeting decisions in previous Gaza operations, said the IDF had not previously targeted the homes of junior Hamas members for bombings. They said they believed that had changed for the present conflict, with the houses of suspected Hamas operatives now targeted regardless of rank.

“That is a lot of houses,” the official told +972/Local Call. “Hamas members who don’t really mean anything live in homes across Gaza. So they mark the home and bomb the house and kill everyone there.”

Targets given ‘score’ for likely civilian death toll

In the IDF’s brief statement about its target division, a senior official said the unit “produces precise attacks on infrastructure associated with Hamas while inflicting great damage to the enemy and minimal harm to non-combatants”.

The precision of strikes recommended by the “AI target bank” has been emphasised in multiple reports in Israeli media. The Yedioth Ahronoth daily newspaper reported that the unit “makes sure as far as possible there will be no harm to non-involved civilians”.

A former senior Israeli military source told the Guardian that operatives use a “very accurate” measurement of the rate of civilians evacuating a building shortly before a strike. “We use an algorithm to evaluate how many civilians are remaining. It gives us a green, yellow, red, like a traffic signal.”

However, experts in AI and armed conflict who spoke to the Guardian said they were sceptical of assertions that AI-based systems reduced civilian harm by encouraging more accurate targeting.

A lawyer who advises governments on AI and compliance with humanitarian law said there was “little empirical evidence” to support such claims. Others pointed to the visible impact of the bombardment.

“Look at the physical landscape of Gaza,” said Richard Moyes, a researcher who heads Article 36, a group that campaigns to reduce harm from weapons.

“We’re seeing the widespread flattening of an urban area with heavy explosive weapons, so to claim there’s precision and narrowness of force being exerted is not borne out by the facts.”

According to figures released by the IDF in November, during the first 35 days of the war Israel attacked 15,000 targets in Gaza, a figure that is considerably higher than previous military operations in the densely populated coastal territory. By comparison, in the 2014 war, which lasted 51 days, the IDF struck between 5,000 and 6,000 targets.

Multiple sources told the Guardian and +972/Local Call that when a strike was authorised on the private homes of individuals identified as Hamas or Islamic Jihad operatives, target researchers knew in advance the number of civilians expected to be killed.

Each target, they said, had a file containing a collateral damage score that stipulated how many civilians were likely to be killed in a strike.

One source who worked until 2021 on planning strikes for the IDF said “the decision to strike is taken by the on-duty unit commander”, some of whom were “more trigger happy than others”.

The source said there had been occasions when “there was doubt about a target” and “we killed what I thought was a disproportionate amount of civilians”.

An Israeli military spokesperson said: “In response to Hamas’ barbaric attacks, the IDF operates to dismantle Hamas military and administrative capabilities. In stark contrast to Hamas’ intentional attacks on Israeli men, women and children, the IDF follows international law and takes feasible precautions to mitigate civilian harm.”

‘Mass assassination factory’

Sources familiar with how AI-based systems have been integrated into the IDF’s operations said such tools had significantly sped up the target creation process.

“We prepare the targets automatically and work according to a checklist,” a source who previously worked in the target division told +972/Local Call. “It really is like a factory. We work quickly and there is no time to delve deep into the target. The view is that we are judged according to how many targets we manage to generate.”

A separate source told the publication the Gospel had allowed the IDF to run a “mass assassination factory” in which the “emphasis is on quantity and not on quality”. A human eye, they said, “will go over the targets before each attack, but it need not spend a lot of time on them”.

For some experts who research AI and international humanitarian law, an acceleration of this kind raises a number of concerns.

Dr Marta Bo, a researcher at the Stockholm International Peace Research Institute, said that even when “humans are in the loop” there is a risk they develop “automation bias” and “over-rely on systems which come to have too much influence over complex human decisions”.

Moyes, of Article 36, said that when relying on tools such as the Gospel, a commander “is handed a list of targets a computer has generated” and they “don’t necessarily know how the list has been created or have the ability to adequately interrogate and question the targeting recommendations”.

“There is a danger,” he added, “that as humans come to rely on these systems they become cogs in a mechanised process and lose the ability to consider the risk of civilian harm in a meaningful way.”

‘To claim there’s precision and narrowness of force being exerted is not borne out by the facts,’ said one researcher. Photograph: Atef Safadi/EPA

Israeli soldiers during ground operations in the Gaza Strip. Photograph: IDF
