By Melike Pala
(Middle East Monitor) – Israel’s use of artificial intelligence (AI) funded by European Union research programmes to target civilians is drawing mounting criticism. Since the Israeli attacks on Gaza began on 7 October 2023, the EU has provided over €238 million ($246m) to Israeli institutions for research and innovation. The funds are believed to have supported the development of AI-driven “location and killing” technology used by Israel against Palestinian civilians in Gaza.
Nozomi Takahashi, a member of the board of directors of the European Coordination of Committees and Associations for Palestine (ECCAP), told Anadolu that they are aware of allegations about EU funds aiding AI technologies targeting civilians. Takahashi said that they had addressed the issue in letters to high-level EU officials, including former EU foreign policy chief Josep Borrell.
She pointed to AI-based systems used by the Israeli army called “Habsora” (The Gospel), “Lavender” and “Where is Daddy?” She said that these systems are used “to identify, locate and kill the targets in the current genocide in Gaza.”
Emphasising that these systems are used indiscriminately against civilians, Takahashi noted: “Such extrajudicial killing is prohibited by international law. The scale and frequency of civilians killed in Gaza using such AI systems are devastating.”
The ECCAP official highlighted the EU’s particular focus on AI development, and said that Israeli research institutions are involved in various EU-funded projects in this field. However, identifying which EU-funded projects underpin the systems used by the Israeli army is impossible due to confidentiality and secrecy. “The potential high risk associated with such technology in the hands of a government that has a record of human rights violations should raise the alarm.”
Only civilian projects, added Takahashi, are eligible for funding through the Horizon Europe programme. “The development of such AI technology further blurs the border between civil and military applications.” She criticised the EU for its “narrow focus” when evaluating the goals of the projects that it funds, for monitoring them insufficiently, and for overlooking their potential military use.
Takahashi highlighted that Horizon Europe’s ethical principles require funded projects to uphold “respect for human dignity, freedom, democracy, equality, the rule of law and human rights, including the rights of minorities.” However, a research entity’s history of military activities or human rights violations is “neither questioned nor required” to be disclosed during ethics reviews, she claimed.
According to Eman Abboud, a lecturer at Trinity College Dublin, it has been demonstrated that EU funds have financed arms companies under the guise of civil security and tech research. She said that the EU is “culpable”, through its funding programmes, for supporting the military industry in Israel, a state currently facing genocide charges at the International Court of Justice (ICJ).
“Israeli companies such as Elbit Systems Ltd. and Israel Aerospace Industries, which profit from and are deeply complicit in Israel’s long-term violent oppression and apartheid, as well as the current genocide of the Palestinian people, have received funding for security research from European funding programmes,” explained Abboud.
Criticising the fact that organisations contributing to human rights violations and the undermining of international humanitarian law can benefit from EU funds, she said: “The EU has refused to sever its trade links with Israel or ban them from Horizon Europe,” despite the ongoing ICJ case against the occupation state.
[Image: “Lavender Genocide Bot,” digital, Midjourney, 2024]
She referenced EU-GLOCTER, a “counter-terrorism” project involving Israeli institutions, noting the links to Israel’s military and intelligence, including Reichman University’s International Institute for Counter-Terrorism (ICT), which was co-founded by a former intelligence chief. “We must understand that institutions like these provide the means to create the intelligence apparatus that is used to target specific civilians in Gaza and in Lebanon. We cannot separate them, given the strategic dual use of academic research funding and military research funding.”
Habsora, an AI technology developed within the Israeli military, generates automated targets in real time; its strikes frequently hit civilian infrastructure and residential areas, with the expected number of civilian casualties always known in advance.
Lavender analyses data collected on approximately 2.3 million people in Gaza, using ambiguous criteria to assess the likelihood that an individual is connected to the Palestinian resistance group Hamas.
Sources told Tel Aviv-based +972 and Local Call that, early in the Gaza attacks, the military was “completely reliant” on Lavender, automatically targeting males it flagged, without oversight or specific criteria. Lavender has marked approximately 37,000 Palestinians as “suspects”.
Using the AI-based system called “Where is Daddy?”, Israel tracks thousands of individuals simultaneously; when targeted individuals enter their homes, they are bombed with no regard for the presence of civilians, including women and children.
These AI technologies are known to make frequent computational errors and to disregard the principle of “proportionality”. They have played a significant role in the killing of over 45,850 Palestinians since 7 October 2023.
The views expressed in this article belong to the author and do not necessarily reflect the editorial policy of Middle East Monitor.