
How AI tells Israel who to bomb

AI is supposed to help militaries make precise strikes. Is that the case in Gaza?

Rajaa Elidrissi is a researcher and producer on the Vox video team, where she works on Vox Atlas and other videos that focus on global issues.

Israel’s war with Hamas, in response to the attacks of October 7, 2023, has led to more fatalities than in any previous Israeli war, with at least 34,000 Palestinians killed as of May 7, 2024. In Israel’s 2014 war in Gaza, just over 1,400 were killed. One factor in that difference is the use of artificial intelligence.

Israel has publicly incorporated AI into its military for years, in both defensive and offensive weapons. But in this war, AI is being deployed differently: It’s generating bombing targets. The promise of AI in a military context is more precise, more accurate strikes. Yet over the past few months, the Israeli outlets +972 Magazine and Local Call have revealed that the multiple AI systems helping the IDF select targets in Gaza have contributed to the highest toll of Palestinian civilian deaths and injuries of any Israeli war.

In our video, we interview multiple experts to understand how two specific systems, Gospel and Lavender, operate, and we explore the broader implications of current and future AI use in warfare.
