Israel used AI to identify 37,000 targets associated with Hamas
Israel's military used a previously undisclosed AI-powered database, known as Lavender, to identify 37,000 potential targets in the recent conflict in Gaza, according to +972 Magazine, a non-profit publication run by a group of Palestinian and Israeli journalists. The targets were selected based on their suspected affiliation with Hamas, the magazine reported. The military's use of the AI system has sparked legal and ethical debates about the relationship between military personnel and technology.
Lavender's target selection
An intelligence officer who operated Lavender praised its efficiency, stating that they trusted a "statistical mechanism" more than a distressed soldier. Another Lavender user emphasized the reduced need for human involvement in the selection process, noting that their role was merely to approve the AI's decisions. These insights were shared by six Israeli intelligence officers who used the AI system during the conflict to pinpoint Hamas and Palestinian Islamic Jihad (PIJ) targets.
Lavender's creation and role in the conflict
Lavender was developed by Unit 8200, a top-tier intelligence division within the Israel Defense Forces (IDF), comparable to America's National Security Agency (NSA) or Britain's Government Communications Headquarters (GCHQ). All six officers confirmed Lavender's crucial role in the conflict, as it processed vast amounts of data to quickly identify potential lower-ranking operatives. Some sources revealed that for certain target categories, the IDF factored in estimated civilian casualties before authorizing a strike.
IDF's response to Lavender's use in conflict
In response to the publication of these testimonies, the IDF stated that its operations adhered to international law's proportionality rules. It described Lavender as a tool used "to cross-reference intelligence sources" and provide up-to-date information on terrorist organizations' military operatives. However, two officers alleged that during the initial weeks of the conflict, they were permitted to kill up to 20 civilians in airstrikes on low-ranking militants.
How Lavender identified human targets
Historically, identifying human targets in IDF operations was a labor-intensive task. However, as the IDF's bombardment of Gaza escalated, commanders required a steady flow of targets. To meet this need, Lavender was heavily utilized to create a database of individuals exhibiting characteristics of a PIJ or Hamas militant. One officer revealed that at its peak, the system listed as many as 37,000 people as potential human targets.
Lavender reportedly achieved a 90% accuracy rate
Information regarding the specific types of data used to train Lavender's algorithm, and the methodology behind the program's decision-making, remains undisclosed. Nevertheless, officers say Unit 8200 honed Lavender's algorithm and adjusted its search criteria in the initial weeks of the conflict. After randomly sampling and verifying its predictions, the unit purportedly determined that Lavender had achieved a 90% accuracy rate, a finding that prompted the IDF to authorize its broad use as a target-recommendation tool.