The Star Malaysia

Zionists’ genocidal ‘video game’ war

The Israeli “soldiers” operating artificial intelligence targeting programmes built very substantial collateral damage into their standard operating procedure.

- By JUAN COLE. Dr Juan Cole is an American academic and commentator on the West’s relationship with the Middle East. This article was originally published on Common Dreams, a reader-supported independent news outlet.

THE incredibly brave and resourceful Israeli journalist Yuval Abraham revealed on Apr 3 in a hard-hitting piece of investigative journalism that the Israeli military has used two artificial intelligence programmes, “Lavender” and “Where’s Daddy,” to target some 37,000 alleged members of the military wings of Hamas and Islamic Jihad.

In the +972 Magazine report, Abraham wrote that the programmes used GPS to discover when a Hamas member had gone home, since it was easiest to hit him there, ensuring that his wife and children would also be killed. If he lived in an apartment building, which most did, then all the civilians in neighbouring apartments could also be killed – children, women, non-combatant men.

Science fiction writer Martha Wells has authored a series of novels and short stories about a “Murderbot,” an artificial intelligence in the body of an armoured warrior. Her Murderbot, despite being lethal, is a good guy, and in noir style frees himself from the control of his corporate overlords to protect his friends.

The Israeli army, in contrast, is acting much more robotically.

Lavender is just a programme and doesn’t have a body attached, but uses Israeli fighter jet pilots as an extension of itself.

The AI programmes identified Hamas militants according to vague specifications. Lavender is known to have a 10% error rate, and in other cases the supposed militant might have had only loose connections to the Qassam Brigades paramilitary or to Islamic Jihad. There was, Abraham reported, almost no human supervision over the workings of the algorithm.

At a 10% error rate, Lavender could have identified 3,700 men in Gaza as Hamas guerrillas when they weren’t. It could have allowed as many as 20 civilians to be killed in each strike on each of these innocents – 74,000 deaths on top of the 3,700 misidentified targets themselves, for a total of 77,700 noncombatants blown away arbitrarily by an inaccurate machine.

“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” one of Abraham’s sources inside the Israeli army, an intelligence officer identified as A., told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

I hope the International Court of Justice, which is considering whether Israel is committing a genocide, is reading +972 Mag.

The AI programme included extremely loose rules of engagement on civilian casualties. It was set to permit 10 to 20 civilians to be killed as part of a strike on a low-level Hamas member, and up to 100 civilians could be killed to get at a senior member. These new rules of engagement are unprecedented even in the brutal Israeli army.

The “Where’s Daddy” programme identified and tracked the members.

The full 37,000 Hamas paramilitary fighters did not carry out Oct 7. Most of them did not know about it beforehand; it was planned and executed by a tiny, tight clique. The civilian wing of Hamas was the elected government of Gaza, and its security forces provided law and order (refugee camps are most often lawless). It may be that Lavender and “Where’s Daddy” swept ordinary police into the definition of low-level Hamas fighters, which would explain a lot.

This new video game way of war violates the Rules of Engagement of the US military and all the precepts of International Humanitarian Law.

The Marine Corps Rules of Engagement say:

c. Do not strike any of the following except in self defence to protect yourself, your unit, friendly forces, and designated persons or property under your control:

– Civilians.

– Hospitals, mosques, churches, shrines, schools, museums, national monuments, and other historical and cultural sites.

d. Do not fire into civilian populated areas or buildings unless the enemy is using them for military purposes or if necessary for your self-defense. Minimize collateral damage.

e. Do not target enemy Infrastructure (public works, commercial communications facilities, dams), Lines of Communication (roads, highways, tunnels, bridges, railways) and Economic Objects (commercial storage facilities, pipelines) unless necessary self-defense or if ordered by your commander. If you must fire on these objects to engage a hostile force, disable and disrupt but avoid destruction of these objects, if possible.

None of the Israeli “soldiers” operating Lavender were in danger from the civilians they killed. They made no effort to “minimize collateral damage.” In fact, they built very substantial collateral damage into their standard operating procedure.

If the Israeli military killed an average of 20 civilians each time it struck one of the 37,000 alleged militants, that would be 740,000 deaths – three-quarters of a million babies, toddlers, pregnant mothers, unarmed women, unarmed teenagers and more. That would be about a third of the total Gaza population.

That is certainly a genocide, however you wish to define the term.

And there is no way that Joe Biden and Antony Blinken haven’t known all this all along. It is on them. – Common Dreams

AI victims: The damaged World Central Kitchen vehicle where employees from the humanitarian aid group were killed in an Israeli airstrike in Rafah. — Reuters
