Israeli ‘AI Targeting System’ Has Caused Huge Civilian Casualty Count In Gaza: Report

The targeting program has reportedly been used in conjunction with another called “Where’s Daddy,” which tracked bombing targets to their family homes.

An artificial-intelligence-fueled targeting system known as “Lavender” has been used for months by the Israeli military in Gaza to select bombing targets with minimal human oversight, according to a new report published Wednesday.

According to the report, which was published by the Israeli-Palestinian outlet +972 along with the website Local Call, military personnel approved the AI-selected targets with what one source said was often just a “rubber stamp,” taking only around “20 seconds” to review each target before approving a bombing.

The report, which was based on unnamed sources and documents, said such review was done simply to confirm a target was male, even though a check of a random sample found that roughly 10% of the people the program marked were not militants at all. Despite that error rate, sources said they received approval around two weeks into the current war to automatically adopt Lavender’s kill lists, according to the report. What’s more, the military reportedly pursued targeted individuals at home, often with family present, using another program ominously called “Where’s Daddy?”

Sometimes, because of a lag in the “Where’s Daddy?” program, families were killed at home even when the main target was not present, the report said. It was not clear from the report to what extent the programs are still in use, though it said they were especially active in the early weeks of the war.

“At 5 a.m., [the air force] would come and bomb all the houses that we had marked,” one unnamed senior officer, referred to in the story as “B,” said. “We took out thousands of people. We didn’t go through them one by one — we put everything into automated systems, and as soon as one of [the marked individuals] was at home, he immediately became a target. We bombed him and his house.”

The outcome of the program, according to the story’s sources, was tens of thousands of targets, and thousands of Palestinian civilians killed in strikes, including women, children and men not involved in combat.

The report underscored global concerns about the rate of civilian casualties in Gaza, where some 33,000 people have died in Israel’s military campaign. The campaign began following the Hamas-led attack of Oct. 7, in which Palestinian militants killed around 1,200 people in Israel and took 240 hostages. In response, Israel has pursued a monthslong bombing campaign and invasion.

It also follows international outrage over a series of Israeli strikes on Tuesday that killed seven food aid volunteers with World Central Kitchen, strikes the leader of the aid group said he believed were intentional. And it comes amid reports that Gaza is heading toward an unprecedented famine, as access to food and basic necessities has been limited in part by tight Israeli controls at border crossings. In a Thursday call with Israeli Prime Minister Benjamin Netanyahu, President Joe Biden called for an immediate temporary cease-fire, per a U.S. readout of the call. Secretary of State Antony Blinken also said the president “made clear the need for Israel to announce a series of specific, concrete, and measurable steps to address civilian harm, humanitarian suffering, and the safety of aid workers,” and that U.S. policy on Gaza would depend on those steps. Israel, in turn, dismissed two officers after an unusually quick investigation and pledged to open more aid routes into Gaza.

Meanwhile, an Israeli siege of Gaza’s largest hospital, Al-Shifa, ended this week with reports that the hospital had been destroyed and that casualties were heavy, including militants but also patients, doctors and hospital workers. (Israel denied any civilian casualties.) Israel has also for weeks publicly discussed a planned large-scale invasion of Rafah in Gaza’s south, where Israeli airstrikes have already taken a heavy toll on civilians despite American warnings.

A spokesperson for the Israeli military, in a statement to HuffPost about “the use of the AI-powered database named Lavender in the bombardment of Gaza,” disputed the assertion that the military used an artificial intelligence system to identify militants at all.

“Information systems are merely tools for analysts in the target identification process” subject to “independent examinations” to determine that targets meet “the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives,” the statement said. The statement said the Israeli military did not carry out strikes where collateral damage was judged to be “excessive in relation to the military advantage.” It concluded, “The IDF outright rejects the claim regarding any policy to kill tens of thousands of people in their homes.”

Separately, the military objected to several aspects of the report, for example, denying the existence of a “kill list” that has reportedly marked some 37,000 mostly junior suspected Hamas militants as part of the Lavender program.

Two sources told +972 and Local Call that the Israeli military judged it acceptable to kill as many as 15 to 20 civilians for every junior Hamas operative targeted and, on occasion, more than 100 civilians for commanders. Another unnamed source, “A.,” an officer in a target operation room, told the publication that the army’s international law department had never before approved such extensive collateral damage. (By contrast, the article noted, General Peter Gersten, the American deputy commander for operations and intelligence in the operation to fight ISIS in Iraq and Syria, once said Osama bin Laden had what’s called a “Non-Combatant Casualty Cut-Off Value” of 30 civilian casualties.)

And one unnamed source said they had personally authorized hundreds of bombings of private homes belonging to alleged junior militants targeted by the AI program, often resulting in collateral damage of entire families. Multiple unnamed sources said the military routinely struck civilian households with no history of military activity. Now, per the report, the Israeli military has stopped generating lists of junior targets for bombing at home — due to a combination of American pressure, mass displacement of Gazans, and the destruction of Gaza’s housing stock.

A., the unnamed source, said that because supposed Hamas operatives often live in households with multiple women and children, “absurdly, it turns out that most of the people you killed were women and children.”

The program’s shocking civilian toll stemmed in part from the use of unguided “dumb” bombs, rather than precision munitions, on junior militants targeted by the AI in the early weeks of the war, according to the report. As one unnamed intelligence officer put it, “you don’t want to waste expensive bombs on unimportant people.” A U.S. intelligence assessment reported by CNN and The Washington Post in December found that nearly half of the air-to-ground munitions used by Israel in Gaza since Oct. 7 were “dumb” bombs.

The author of Wednesday’s report is Yuval Abraham, an Israeli journalist and filmmaker known for his public call to end what he referred to as a system of “apartheid” in Israel and the Palestinian territories. In November, Abraham published a report on what an unnamed former intelligence officer told him was a “mass assassination factory,” a reference to AI-powered targeting decisions.

That November report also detailed Israeli bombing of so-called “power targets” including universities, banks and government offices, in what multiple sources said was an effort to exert “civil pressure” on Hamas.

However, “Lavender” is different from the AI-targeting tool discussed in the November report — known as “Habsora,” or “The Gospel” — because it tracks people rather than structures, Abraham reported. The program reportedly identifies targets by adding up various “features” supposedly indicating militant involvement with Hamas or Palestinian Islamic Jihad, including, in the report’s words, “being in a Whatsapp group with a known militant, changing cell phone every few months, and changing addresses frequently.” The report’s sources said almost every person in Gaza received a 1-to-100 rating expressing the likelihood that they were a militant.
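To make that description concrete, the following is a purely illustrative sketch, not the actual system: the report says only that Lavender adds up “features” and produces a 1-to-100 likelihood rating, so a scoring mechanism of that general shape could, in principle, be as simple as summing weights for whichever features are present and clamping the total. Every feature name, weight and value below is a hypothetical placeholder invented for illustration, not drawn from the report.

```python
# Purely illustrative toy, NOT the actual Lavender system. The report only
# describes summing "features" into a 1-to-100 likelihood rating; all
# feature names and weights here are hypothetical placeholders.

HYPOTHETICAL_WEIGHTS = {
    "in_group_with_flagged_person": 40,
    "frequent_phone_changes": 25,
    "frequent_address_changes": 20,
}

def toy_rating(observed_features: set[str]) -> int:
    """Sum the weights of observed features and clamp the total to 1-100."""
    total = sum(HYPOTHETICAL_WEIGHTS.get(f, 0) for f in observed_features)
    return max(1, min(100, total))

# Example: two hypothetical features present yields a rating of 65.
print(toy_rating({"in_group_with_flagged_person", "frequent_phone_changes"}))
```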

In November last year, the United States said it was among dozens of states endorsing a set of non-binding guidelines called the “Political Declaration on Responsible Military Use of Artificial Intelligence and Autonomy,” which among other things stresses obligations under international humanitarian law and says: “States should ensure that senior officials effectively and appropriately oversee the development and deployment of military AI capabilities with high-consequence applications, including, but not limited to, such weapon systems.”

In February, the Netherlands hosted a summit on military AI applications. Israel attended the summit, Reuters reported at the time, but as of Feb. 12 it had not endorsed the AI declaration, according to the U.S. State Department.
