Lavender: the Israeli Army's powerful artificial intelligence system that suggests whom to kill in Gaza

The system identifies possible Hamas members using highly questionable data; the resulting strikes kill the targets and those nearby.

Destruction in Gaza. Photo: Reuters.

The Israeli Army uses some of the most advanced military technology in the world, including weapons supported by artificial intelligence (AI). One of the most powerful is Lavender, a system able to identify a target in order to bomb it. Between the start of the war against Hamas on October 7, 2023, and November 24, the system marked 37,000 Palestinians as targets, and some 15,000 of them were killed, according to Israeli media outlets +972 Magazine and Local Call.

The system is designed so that targets are attacked in their homes, especially at night, which means strikes also kill those around them. The automation of human targeting is a complete novelty in warfare, and a false positive can mean the death of innocents, according to a report in El País.


The Israel Defense Forces (IDF), however, rejected the idea that the machine decides "whether someone is a terrorist," insisting that such technologies "are mere tools for analysts in the target identification process." Yet the sources cited, members of the Army and of Israeli intelligence services such as Unit 8200, affirm that officers follow Lavender's recommendations without performing checks.

Some of the parameters used to flag a member of Hamas or Islamic Jihad include whether an individual changes phones frequently, something that happens very often in a conflict zone, or the simple fact of being male, since women do not hold official positions within the armed group.

Destruction of Al Shifa hospital, Gaza Strip. Photo: Reuters.

Like any system that works on statistics, Lavender has margins of error: at least 10% of the people it marked were not Hamas militants. And given the figures admitted by the Israeli Army itself, each target indicated by the software implies multiple deaths in a single bombing.
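The scale of that error margin can be made concrete with simple arithmetic, using only the figures reported by +972 Magazine and Local Call (37,000 people marked, an error rate of at least 10%):

```python
# Illustrative arithmetic only, based on the figures reported in the article.
marked = 37_000        # people marked by the system, per +972 Magazine
error_rate = 0.10      # lower bound of the reported error rate

false_positives = int(marked * error_rate)
print(false_positives)  # 3700: people marked who were not militants
```

At the reported lower bound, that is roughly 3,700 people wrongly marked as militants, before counting anyone killed alongside them.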

This system operates alongside others in Israel's arsenal: Where is Daddy?, which tracks marked individuals and triggers a strike when they arrive home, and Gospel, which identifies buildings where Hamas allegedly operates. Lavender processes information on Gaza's more than 2.3 million residents and assigns each person a score from 1 to 100, reflecting the estimated probability that they belong to the armed group. Those with the highest scores are killed, along with those around them.
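The general mechanism described here, scoring a whole population and acting on everyone above a cutoff, can be sketched in a few lines. This is a generic toy illustration of threshold-based scoring, not Lavender's actual code; the names, scores, and cutoff are hypothetical:

```python
# Toy sketch of threshold-based scoring: each profile carries a 1-100 score
# and those at or above a cutoff are flagged. All data here is invented.
from dataclasses import dataclass

@dataclass
class Profile:
    person_id: str
    score: int  # 1-100; higher means higher estimated probability

def flag_above_threshold(profiles, threshold):
    """Return the IDs of profiles whose score meets or exceeds the cutoff."""
    return [p.person_id for p in profiles if p.score >= threshold]

population = [Profile("a", 12), Profile("b", 91), Profile("c", 88)]
print(flag_above_threshold(population, 90))  # ['b']
```

The sketch also makes the false-positive problem visible: lowering the cutoff flags more people, and any statistical score will place some non-members above it.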


Is it legal to use weapons like Lavender in a war?

"The Israeli military uses AI to augment the decision-making processes of human operators. This use is in accordance with international humanitarian law, as applied by modern armed forces in many asymmetric wars since September 11, 2001," said jurist Magda Pacholska, a researcher at the TMC Asser Institute and a specialist in the intersection of disruptive technologies and military law.

Pacholska added that the Israeli Army had already used this AI system in Operation Guardian of the Walls in 2021. Other countries, such as France, the United States, and the Netherlands, have also used such systems, although against material structures. "The novelty is that, this time, it uses these systems against human targets," the expert stressed.

Destruction of Al Shifa hospital, Gaza Strip. Photo: Reuters.

For his part, Arthur Holland Michel, who reports on the use of autonomous weapons in armed conflicts for the UN, said: "What is different, and certainly unprecedented, in the case of Lavender in Gaza is the scale and speed at which the system is being used. The number of people identified in just a few months is staggering. Another crucial difference is that the time between the algorithm's identification of a target and the attack against it appears to have often been very short. That indicates there is not much human review in the process. From a legal point of view, this could be problematic."

Luis Arroyo Zapatero, honorary rector of the University of Castilla-La Mancha and a specialist in international criminal law, said that attacking civilians and civilian infrastructure in a war is a crime "against humanity." "The deaths caused as collateral damage are pure murders. The Lavender system is outright a machine for killing civilians, since it admits collateral deaths of between 10 and 100 civilians beyond the precise target," the expert added.

Situation in Gaza due to the conflict between Israel and Hamas. Photo: Reuters.

In addition to indiscriminate bombing, the Israeli Army has long monitored the Palestinian population, collecting all kinds of data. According to The Washington Post, a program called Blue Wolf records the faces of West Bank residents, whether children, adults, or the elderly, and associates each with a level of "dangerousness." When soldiers see people on the street, they can check a code that indicates whether or not to arrest them. A similar system is used in Gaza, according to The New York Times.

"Facial recognition everywhere, drones, spy technology… This state is really an incubator for surveillance technologies. If you sell a product, you have to show how effective it is in real scenarios and in real time. That's what Israel is doing," said Cody O'Rourke of the NGO Good Shepherd Collective, based in Beit Sahour, a Palestinian town east of Bethlehem. The American volunteer, who has lived in Palestine for two decades, added that he is on a blacklist, which is why he is always delayed at Israeli military checkpoints.

"Israel has spent decades delegitimizing the peace process with the Palestinians, and it has never been interested in achieving peace. It needs the world to legitimize its occupation, and it uses technology as a calling card to sustain that occupation," wrote Antony Loewenstein in his book The Palestine Laboratory (Captain Swing, 2024).
