Israel uses AI ‘Lavender’ to identify bombing targets in Gaza

The Israeli army used Lavender to generate a list of 37,000 potential targets linked to the Palestinian militant group Hamas.

Israel has been using an artificial intelligence (AI) system called “Lavender” to identify Palestinians as potential targets for air strikes in the Gaza Strip, part of a military campaign that has killed more than 33,000 Palestinians since October 7, 2023.

According to a new investigation by +972 Magazine and Local Call, the Israeli army used Lavender to generate a list of 37,000 potential targets linked to the Palestinian militant group Hamas.

The system was developed by the Israel Defense Forces’ elite intelligence division, Unit 8200.

The deployment of artificial intelligence was intended to reduce delays caused by human involvement, speeding up the army’s ability to identify targets and secure approval for strikes.

The investigation is based on interviews with six unnamed Israeli intelligence officers who served during the war with Hamas in Gaza and were involved in using AI to select targets for killing.

Lavender’s target selections were initially treated as questionable, but military officials approved the system’s sweeping use after a sample check showed a 90 percent accuracy rate.

“One source stated that human personnel often served only as a ‘rubber stamp’ for the machine’s decisions, adding that, normally, they would personally devote only about ‘20 seconds’ to each target before authorising a bombing.”

During the early stages of the war, the army did not require officers to independently review the AI tool’s target selections, despite knowing the system erred in roughly 10 percent of cases.

“The Israeli army systematically attacked the targeted individuals while they were in their homes – usually at night while their whole families were present – rather than during military activity,” the investigation states.

One of the intelligence officers who served in the war told +972 that bombing a family home was easier than targeting suspected militants when they were away from civilians.

“The IDF bombed them [Hamas operatives] in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations,” the officer said.

Rather than using precision-guided munitions, the army frequently dropped unguided “dumb bombs” on suspected militants in their homes, increasing the collateral damage.

The report attributed the unprecedented civilian death toll, including the disproportionately high number of women and children killed, to these practices.

In a statement, the Israeli army denied using artificial intelligence to identify Hamas operatives or to predict whether a person is a militant, describing the system instead as a database used to cross-reference intelligence sources, and said its targets comply with international law and IDF directives.

The investigation coincides with widespread international condemnation of Israel’s military campaign in Gaza, which has killed more than 33,000 Palestinians and wounded 75,668 since October 7.

On Monday, April 1, Israeli air strikes killed seven foreign aid workers in Gaza in a targeted attack, sparking global outrage.