
Following the Assassinations of Shokr and Haniyeh, Questions Arise About Israel’s Precision in Targeting “Human” Targets

Published on 21.08.2024
Reading time: 7 minutes

This is not the first time Israel has employed artificial intelligence; it previously used these technologies during its war on Gaza in 2021 and in the ongoing conflict.


Israel’s response to the Majdal Shams incident, which it claimed was orchestrated by Hezbollah, was not surprising. However, the precision with which it identified and assassinated leaders of Hezbollah and Hamas raised eyebrows. There has been much speculation about a deep “penetration” of the security apparatus of Iran and its allies in the region, with talks of “agents” providing Israel with reports on the whereabouts and movements of these leaders. Since October 7, Israel has been promoting a new generation of armed attacks that rely on algorithmic targeting.

On the night of July 30-31, Israel struck the Haret Hreik area of Beirut’s southern suburbs, the Dahiyeh, assassinating Fouad Shokr, a senior advisor to Hezbollah’s Secretary-General Sayyed Hassan Nasrallah, and hit an exclusive veterans’ facility belonging to the Revolutionary Guards in northern Tehran, where Ismail Haniyeh, the head of Hamas’s political bureau, was also assassinated.

In 1992, locating targets was far more complex due to the “miserable state of Israeli intelligence on Hezbollah,” as described by New York Times journalist Ronen Bergman in his book Rise and Kill First: The Secret History of Israel’s Targeted Assassinations.

Bergman details the difficulties Israeli intelligence faced in gathering information about the residence of Hezbollah’s former Secretary-General, Sayyed Abbas al-Musawi. Israeli intelligence began by tracking Lebanese press reports, consulted a Ph.D. thesis on Hezbollah, and then recruited agents to gather intelligence on al-Musawi, paving the way for his assassination.

Israel operates what former Israeli army officers describe as a “mass assassination factory,” where children in Gaza are not seen as “collateral damage.” According to these officers, every victim is killed based on algorithms and calculations known to the military.

The Israeli secret intelligence service, the Mossad, also uses artificial intelligence to carry out assassinations in Iran, exploiting the country’s entanglement in international sanctions, which hinder its ability to counter such programs.

An Iranian security source told The Atlantic that “Israel used AI technology to track Ismail Haniyeh in preparation for his assassination, the same system it used in 2020 to assassinate a senior security official near Tehran.”

In November 2020, Israel assassinated Mohsen Fakhrizadeh, an Iranian Revolutionary Guard officer and nuclear scientist, with a bullet fired via satellite command. The New York Times reported, citing an intelligence official familiar with the plan at the time, that “Israel chose a special model of the Belgian-made FN MAG machine gun, equipped with an advanced robotic system and multiple cameras to assassinate the nuclear scientist, and this ‘killer robot,’ capable of firing 600 rounds per minute, was operated via satellite.”

Media reports indicate that Israel assassinated Ismail Haniyeh with a bomb hidden in his room for nearly two months.

Axios, citing sources, reported that the bomb was in a high-tech device using artificial intelligence, and it was detonated remotely by Mossad agents on Iranian soil.

Lebanon, due to its limited financial resources, cannot develop its technology, telecommunications, and cybersecurity capabilities, which exposes its citizens to data leaks and potentially being added to the list of AI-determined assassination targets.

Experts have revealed that Israel has infiltrated Lebanon’s telecommunications network and is using data such as voice recognition to identify targets.

Machines Decide Who Lives and Who Dies

Since the beginning of the October 7 attacks, Israel has revealed three target-generation systems: The Gospel for infrastructure targets, Lavender for individual human targets, and Where is Daddy?—a system designed to track and target suspected individuals when they are at home with their families.

In 2021, Yossi Sariel, the current commander of the Israeli intelligence Unit 8200, argued in his book for the design of a machine capable of analyzing vast amounts of data to generate thousands of military targets. According to an investigative report cited by The Guardian, this machine is indeed operational under the name Lavender.

Since the October 7 attacks, Lavender has turned Israeli soldiers into mere “rubber stamps,” as they are required to spend just 20 seconds reviewing a target and confirming that it is “male and not female” before proceeding with the strike.

Unit 8200, part of Israeli intelligence, designed this system to identify individuals suspected of having ties to the military wings of Hamas and Islamic Jihad, including those with low ranks, as potential targets for airstrikes.

According to the reporting, the program’s identifications are accurate roughly 90 percent of the time, and it can discern whether individuals have weak ties to the targeted groups, such as Hamas, or no connection at all. The system assigns almost every person in Gaza a rating from 1 to 100, reflecting the likelihood that the person is a combatant.

Lavender evaluates individuals based on a range of data, such as participation in a WhatsApp group with a known fighter, changing mobile phones every few months, photos, phone numbers, social media friendships, and frequent changes of address.

The system even adds individuals with communication patterns similar to those of Hamas or Islamic Jihad members—like police officers or civil defense personnel—to the assassination list. Consequently, civilians can be mistakenly targeted for assassination simply because they share a name or surname with a combatant or because they used a mobile device once owned by one.

The type of weapon used in assassination operations varies after Lavender identifies the targets, ranging from unguided “dumb bombs” to precision-guided munitions. Lavender classifies Hamas fighters as “minor” or “major,” and for each “minor fighter,” up to 15 or 20 civilians may be killed.

The Guardian cites an investigative report stating that the Israeli army allows for the killing of more than 100 civilians if the target is a senior Hamas official, such as a battalion or brigade commander. The report quotes an intelligence officer saying that the Israeli army “doesn’t waste expensive bombs on unimportant people—it’s too costly for the country, and there’s a shortage of such bombs.”

The Israeli army had previously announced its use in warfare of The Gospel, an AI system developed by Unit 8200. The system can identify 100 bombing targets per day, compared with the roughly 50 targets the army previously identified in an entire year.

The program relies on information and images captured by drones and uses data from surveillance towers to monitor the movements of targeted individuals and civilian casualty figures from bombings.

Israel often assassinates its adversaries’ leaders when they are at home rather than on the front lines. The intelligence unit designed Where is Daddy? to track suspected militants and notify intelligence officers the moment a person enters their residence, so that a lethal strike can follow.

Israel has access to AI programs such as biometric surveillance, predictive policing systems, GPS-based targeting technologies, biometric sensing, and behavior analysis.

According to The New York Times, Israel uses biometric surveillance, a system that employs data from the Israeli company Corsight and Google Photos to identify faces in crowds, even from low-quality drone footage.

Time reported that the Israeli government signed the Nimbus agreement with global technology companies like Google and Amazon AWS, worth $1.2 billion. These companies, along with Microsoft Azure, have provided data storage and AI services to Israel since the October 7th attacks.

Corsight, an Israeli company, offers a program called Fortify, an advanced facial-recognition AI system that, according to the company’s website, remains accurate in crowds and in challenging conditions such as rain, operates almost entirely in the dark, and can recognize individuals even if their faces are partially covered or their images are of low quality. Corsight also offers its technology through an API that it claims can identify people even when they are wearing masks or personal protective equipment (PPE). In 2020, Corsight received $5 million in funding from the Canadian fund Awz Ventures, money that helps Israel identify its “human targets.” The company claims on its website that it is trusted by global platforms such as the Apple App Store and Google Play.

Who does Israel target? How? And how many victims will there be? These are questions answered by the AI systems used by Unit 8200, which provide a list of “targets” and their “locations.” Regardless of Corsight’s claims, such as its programs’ compliance with Article 25 of the European Union’s General Data Protection Regulation (GDPR), this does not guarantee that the right to privacy is protected and preserved. While EU regulations require designers and controllers to implement measures to protect individuals’ data, such as encryption, questions remain about what “encryption” means in practice and whether these AI programs are designed to identify the data’s owners in the first place.

Even though Israel is not bound by EU regulations, its responsibility remains, given that most of those it monitors are civilians, many of whom hold foreign, including European, nationalities.

In its war on Gaza, Israel has recorded a new breach of international law and human rights, not only through the use of AI in mass killings but also by intruding into the lives of civilians in the Middle East.