We are witnessing the genocide of the Palestinians driven by algorithms and machine learning; a system of apartheid in the Israeli-occupied West Bank and Gaza Strip reinforced by artificial intelligence; and surveillance and facial recognition systems of such prowess that the regime of Orwell’s 1984 would be green with envy. For the Palestinians, today’s Israeli-occupied Palestine plays out like the script of a dystopian, totalitarian sci-fi film. And the Zionists are fuelling this AI nightmare.
From the outset of its current war against the Palestinians in Gaza, the Zionist regime, blinded by revenge for 7 October, has leveraged AI in the most indiscriminate and barbaric way to kill tens of thousands of innocent Palestinian civilians. One such insidious AI tool that has dominated the headlines is The Gospel. Since last October, Israel has used this AI system to accelerate the generation of Hamas targets. More specifically, The Gospel marks structures and buildings from which the IDF claims Hamas “militants” operate. This fast-paced target generation, combined with Israel’s disinclination to adhere to international humanitarian law and US support emboldening Prime Minister Benjamin Netanyahu’s government, has led to a modern-day genocide.
The Gospel is used by Israel’s elite Unit 8200 cyber and signals intelligence unit to analyse “communications, visuals and information from the internet and mobile networks to understand where people are,” a former Unit 8200 officer explained. The system was already active during the 2021 offensive, according to Israel’s former army chief of staff Aviv Kochavi: “…in Operation Guardian of the Walls [in 2021], from the moment this machine was activated, it generated 100 new targets every day… in the past [in Gaza] we would create 50 targets per year. And here the machine produced 100 targets in one day.”
Israel’s apathy towards civilians is evident in its willingness to kill many innocents in order to hit its AI-generated targets, whether they be hospitals, schools, apartment buildings or other civilian infrastructure. For example, on 10 October last year, Israel’s air force bombed an apartment building, killing 40 people, most of them women and children. Israel has also used unguided “dumb” bombs, which cause more collateral damage than precision-guided munitions, against low-level Hamas targets. “The emphasis is on damage and not on accuracy,” said the Israel Defence Forces’ own spokesperson. This is the primary reason why the civilian death toll is so high, as is the number of wounded. According to one study, the current war’s Palestinian casualty toll of 110,000 and growing (most of them civilians) is almost six times the 18,992 Palestinian casualties of the previous five military offensives combined.
Image: “Lavender 3,” digital, Dream/Dreamworld v. 3, 2024.
Two other AI programmes that have made the news recently are Lavender and Where’s Daddy? Lavender differs from The Gospel in that Lavender marks human beings as targets (creating a kill list), whereas The Gospel marks buildings and structures allegedly used by combatants. Israeli sources claim that in the first weeks of the war the military relied heavily on Lavender, which generated a list of 37,000 Palestinians as “suspected militants” to be killed in air strikes. As previously mentioned, Israel’s apathy towards the Palestinians has been evident in its mass killing of civilians in order to eliminate even a single Hamas member. According to Israeli military sources, the IDF decided initially that, for every junior Hamas member targeted, 15 or 20 civilians could be killed.
This brutality is unprecedented.
If the Hamas member was a senior commander, the IDF on several occasions authorised the killing of over 100 civilians to fulfil its objective. For example, an Israeli officer said that on 2 December, in order to assassinate Wissam Farhat, commander of the Shuja’iya Battalion of Hamas’s military wing, the IDF knew that it would kill over 100 civilians and went ahead with the strike regardless.
If this were not contentious enough, Lavender was also used without any significant checks and balances. On many occasions, the only human scrutiny applied was a check that the person in question was not a woman. Beyond this, Lavender-generated kill lists were trusted blindly.
However, the same sources explain that Lavender makes mistakes; apparently, it has a 10 per cent error rate. This means that at times it tagged innocent people or individuals with only loose connections to Hamas, something Israel purposefully overlooked.
Moreover, Lavender was also programmed to be sweeping in its target creation. One Israeli officer, for example, was perturbed by how loosely a Hamas operative was defined and by the fact that Lavender was trained on data that included civil defence workers. Such vague connections to Hamas were exploited by Israel, and thousands were killed as a result. UN figures bear out the devastating effect of this policy: during the first month of the war, more than half of the 6,120 people killed belonged to 1,340 families, many of which were wiped out completely.
Where’s Daddy? is an AI system that tracks targeted individuals so that the IDF can assassinate them. This AI, along with The Gospel, Lavender and others, represents a paradigm shift in the country’s targeted killing programme. In the case of Where’s Daddy?, the IDF would wait deliberately for the target to enter his home and then order an air strike, killing not only the target but also his entire family and other innocents in the process. As one Israeli intelligence officer asserted: “We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.” In an even more horrific turn, sometimes the targeted individual would not even be at home when the air strike was carried out, because of the time lapse between Where’s Daddy? sending out an alert and the bombing taking place. The target’s family would be killed, but not the target. Tens of thousands of innocent Palestinians, primarily women and children, are believed to have been killed because of this.
Israeli AI software also permeates the occupied West Bank, where it is a part of everyday Palestinian life.
In Hebron and East Jerusalem, Israel uses an advanced facial recognition system dubbed Red Wolf. Red Wolf is used to monitor the movements of Palestinians through the many fixed and “flying” security checkpoints. Whenever Palestinians pass through a checkpoint, their faces are scanned without their knowledge or consent and then checked against other Palestinian biometric data. If Red Wolf identifies an individual, whether because of a previous detention, activism or participation in protests, it decides automatically whether that person should be allowed to pass.
If a person is not in the system’s database, their biometric identity and face are saved without consent and they are denied passage. This also means that Israel holds an exhaustive database of Palestinians, which it uses regularly to crack down not just on so-called militants, but also on peaceful protesters and other innocent Palestinians. According to one Israeli officer, the technology has “falsely tagged civilians as militants.” Moreover, it is highly likely that Red Wolf is connected to two other military-run databases, the Blue Wolf app and Wolf Pack. IDF soldiers can even use their mobile phones to scan Palestinians and access private information about them. According to Amnesty International, “Its [Red Wolf’s] pervasive use has alarming implications for the freedom of movement of countless Palestinians…”
For years, Israel has used the occupied Palestinian territories as a testing ground for its AI products and spyware. In fact, the country sells “spyware to the highest bidder, or to authoritarian regimes with which the Israeli government wanted to improve relations” on the basis that it has been “field tested”. Israel’s own use of this technology is the best advertisement for its products. The head of Israel’s infamous Shin Bet internal spy agency, Ronen Bar, has stated that the agency is using AI to prevent terrorism and that Israel and other countries are forming a “global cyber iron dome”. The glaring issue here is Israel’s violation of Palestinian rights through the surveillance of their social media, as well as wrongful detention, torture and the killing of innocent people. “The Israeli authorities do not need AI to kill defenceless Palestinian civilians,” said one commentator. “They do, however, need AI to justify their unjustifiable actions, to spin the killing of civilians as ‘necessary’ or ‘collateral damage,’ and to avoid accountability.”
Israel has entered into a controversial $1.2 billion contract with Google and Amazon called Project Nimbus, which was announced in 2021. The project’s aim is to provide cloud computing and AI services to the Israeli military and government. This will allow further surveillance and the illegal collection of Palestinian data. Google and Amazon employees dissented and wrote an article in the Guardian expressing their discontent: “[O]ur employers signed a contract… to sell dangerous technology to the Israeli military and government. This contract was signed the same week that the Israeli military attacked Palestinians in the Gaza Strip – killing nearly 250 people, including more than 60 children. The technology… will make the systematic discrimination and displacement carried out by the Israeli military and government even… deadlier for Palestinians.” The contract reportedly contains a clause that prevents Google and Amazon from withdrawing, so the companies’ continued acquiescence is all but guaranteed. According to Jane Chung, spokeswoman for No Tech For Apartheid, over 50 Google employees have been fired without due process because of their protests against Project Nimbus.
The Palestinians are perhaps the bravest people in the world.
Whether it comes from the barrel of a gun, the casing of a bomb or the code of an AI system, Israeli oppression will never deter them from standing up for their legitimate rights. Their plight has awoken the world to the nature of the Israeli regime and its brutal occupation, with protests and boycotts erupting in the West and the Global South. Using their propaganda media channels, the US and Israel are trying to placate the billions who support Palestine, even as the genocide continues. Israel hopes that its Machiavellian system will demoralise the Palestinians and render them an obsequious people whose screams are silenced, but, as always, it underestimates their indefatigable spirit, which, miraculously, grows stronger with every adversity.
The views expressed in this article belong to the author and do not necessarily reflect the editorial policy of Middle East Monitor or Informed Comment.