[Blog] Lavender - the alleged AI system used by Israel in the Gaza war

On 5th April 2024, the UN Secretary-General expressed his deep concern over the alleged use of AI by Israel in its war in Gaza. In an article dated 11th April in CounterPunch, Binoy Kampmark pointed out that these concerns were shared by the digital community, which feared that the use of AI in warfare could lead to "remorseless killing".


But this is no longer a matter of speculation; it has already become a reality. It all started with a book entitled 'The Human Machine Team: How to Create Human and Artificial Intelligence That Will Revolutionise Our World', written in 2021 by Brigadier General Y.S., Commander of the Israeli Intelligence Unit 8200. In this book, Y.S. expounds his proposal for the creation of a "system capable of generating thousands of potential targets". Such a system would resolve the bottleneck caused when humans are involved in identifying targets and the precious time lost in making decisions and approving those targets. With AI, humans would be needed neither to identify targets nor "to vet, check and verify" their viability. All this would be done by the machine.

Following in-depth interviews with six Israeli intelligence officers, +972 Magazine discovered that the Israeli army has been using such an AI system in its present war in Gaza. It is called Lavender. It is more sophisticated than another system the army uses, called Habsora or The Gospel, in the sense that the latter identifies buildings and structures whereas the former generates "kill lists".

These lists are built from information gathered through a mass surveillance system operated by Israel on the 2.3 million people in Gaza. On the basis of signals such as "being in a WhatsApp group with a known militant, or changing mobile numbers every few months or changing addresses frequently", Lavender gives every Gazan a rating of 1 to 100.

Thus some 37,000 Palestinians have been marked as suspected militants to be targeted by air strikes. According to the intelligence officers, the army gave "sweeping approval" to Lavender's kill lists without thoroughly checking or examining the raw intelligence data. Officers became mere "rubber stamps".

Lavender is supplemented by another system called "Where's Daddy". This system tracks the suspects to their residences and tips off the Israeli army when they reach home. Their houses are then bombed, preferably at night, together with everybody in them, including women and children. One intelligence officer stated that rather than killing Hamas operatives or suspects in combat situations, this was the Israeli army's preferred option: "the IDF bombed them in homes without hesitation, as a first option. It's much easier to bomb a family's home. The system is built to look for them in these situations". The result, therefore, is very often a massacre.

Once the target is identified and the tip-off is given by 'Where's Daddy', the Israeli human operatives have only 20 seconds to decide whether to go ahead or not. In most cases, the go-ahead is given. During this vanishingly short period of time, they base their decision on criteria such as whether the target is a junior suspect or a senior Hamas operative. For a junior suspect, some 15 to 20 dead civilians are deemed acceptable, whereas for a senior operative the level of acceptability goes up to 300 collateral victims. What is more callous is that most of the time 'dumb' bombs are used, that is, unguided bombs lacking precision, because "precision bombs are expensive and in short supply". This often results in mass killings and the flattening of entire blocks.

According to the six Israeli intelligence officers interviewed by +972 Magazine, Lavender was extensively used during the first stages of the war without any supervisory mechanism, despite its known 10% inherent margin of error, on top of errors such as a Hamas target having given his mobile to "his son, his brother, a neighbour or someone else". The person using that mobile would then be bombed in his house with his family. Other errors relate to people with similar names or nicknames, or to former Hamas operatives who left the organisation long ago but still appear in the kill lists generated by Lavender.

Officially, the Israeli army has denied the existence of Lavender, but +972 Magazine, whose Jewish editor, Yuval Abraham, recently won the Panorama Audience Award at the last Berlinale international film festival, maintains that it has first-hand information from officers serving in Gaza.

All this cannot but alert civil and political rights activists to the dangers of a mass surveillance system anywhere in the world.

Azize Bankur

Sources: +972 Magazine, CounterPunch