

'Lavender,' the AI giving bombing orders in Israel's war on Gaza

An investigation also revealed the use of an AI system called "Where's Daddy?" to track targeted individuals and carry out bombings when they were inside their family residences.


A Palestinian woman reacts as others rush to look for victims in the rubble of a building following an Israeli strike in Khan Younis in the southern Gaza Strip. (Credit: Mohamad Abed/AFP)

BEIRUT — A recent joint investigation by the Israeli left-wing online news and opinion magazine +972 and the Hebrew-language news site Local Call exposes the existence of an AI-based program named "Lavender," developed by the Israeli army to identify human targets.

Lavender, kept under wraps until now, is reported to have played a significant role in the targeting of Palestinians, particularly during the initial stages of the recent war in Gaza, despite making errors at least 10 percent of the time.

The investigation also found that the military authorized significant civilian casualties during targeted assassinations, including allowing the killing of up to 15 or 20 civilians for every junior Hamas operative marked by Lavender. 


According to six former Israeli intelligence officers contacted by the two outlets, all of whom were directly involved in the use of AI for warfare, Lavender is designed to identify potential targets for assassination within the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including individuals of low rank. During the early phase of the conflict, the army relied heavily on Lavender, marking around 37,000 Palestinians as suspected militants for potential airstrikes, along with their residential locations.

The investigation highlights that the military largely trusted Lavender's decisions, treating its outputs "as if it were a human decision." Human personnel often merely served as a "rubber stamp" for the machine's choices, spending minimal time verifying targets before authorizing bombings.

Despite knowing that Lavender makes errors in approximately 10 percent of cases, and sometimes identifies individuals with loose or no connections to militant groups, the army approved airstrikes on marked targets with little scrutiny. Notably, airstrikes predominantly targeted individuals in their homes, often at night when their families were present, as this was deemed easier from an intelligence perspective.

The investigation reveals the use of another AI system called "Where's Daddy?" to track targeted individuals and carry out bombings when they were inside their family residences.


The consequences of these actions were severe, with thousands of Palestinians, mostly non-combatants, killed by Israeli airstrikes in the conflict's early stages. Lavender was employed most heavily at the start of the conflict, but it is still being used as the war rages into its seventh month.

The military's preference for unguided missiles over precision bombs led to significant collateral damage, including the destruction of entire buildings and the deaths of civilians. Guided by Lavender's "intelligence," the military authorized assassinations of senior Hamas officials that sometimes killed more than 100 civilians in a single airstrike.

The investigation is structured around six chronological stages of the Israeli army's automated targeting process during the early weeks of the Gaza war, which started on Oct. 7.
