Israel uses AI to expand Gaza targeting: report


The Israel Defense Forces’ expanded authorization for bombing non-military targets, the loosening of constraints on expected civilian casualties, and the use of an artificial intelligence system to generate more potential targets than ever before appear to have contributed to the destructive nature of the current war on the Gaza Strip, an investigation by the progressive Israeli website +972 reveals. These factors, as described by current and former Israeli intelligence officials, have likely played a role in producing one of the deadliest military campaigns against Palestinians since the Nakba of 1948.

The investigation by +972 and affiliated website Local Call is based on conversations with seven current and former members of Israel’s intelligence community—including military intelligence and air force personnel who were involved in operations in the besieged Strip—in addition to Palestinian testimonies, data, and documentation from within Gaza, and official statements from the IDF and other Israeli state institutions.

Compared to previous Israeli assaults on Gaza, the current war—codenamed “Operation Iron Swords”—has seen the IDF significantly expand its bombing of targets that are not distinctly military in nature. These include private residences as well as public buildings, infrastructure, and high-rise blocks, which sources say the army defines as “power targets” (matarot otzem).

The bombing of “power targets,” according to intelligence sources who had first-hand experience with its application in Gaza in the past, is mainly intended to disrupt Palestinian civil society—to “create a shock” that, among other things, will “lead civilians to put pressure on Hamas,” as one source put it.

Several of the sources, who spoke on the condition of anonymity, confirmed that the IDF maintains files on the vast majority of potential targets in Gaza—including homes—which stipulate the number of civilians likely to be killed in an attack. This number is calculated in advance by the army’s intelligence units.

In one case discussed by the sources, the Israeli military command knowingly approved the killing of hundreds of Palestinian civilians in an attempt to assassinate a single top Hamas commander. “The numbers increased from dozens of civilian deaths [permitted] as collateral damage as part of an attack on a senior official in previous operations, to hundreds of civilian deaths as collateral damage,” said one source.

“Nothing happens by accident,” said another source. “When a three-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed—that it was a price worth paying in order to hit [another] target. We are not Hamas. These are not random rockets. Everything is intentional. We know exactly how much collateral damage there is in every home.”

According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (The Gospel), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”

The +972 report was released as accusations of war crimes and even genocide continue to mount against Israel.

Photo: Maan News Agency