Revealed: 'Lavender' AI system's role in Gaza strikes raises war crime concerns
A few days after a drone strike hit a humanitarian convoy in the Gaza Strip, a startling leak surfaced: +972 Magazine described a highly sophisticated system reportedly used by the Israeli military. The revelation carries particular weight in light of the recent deaths of aid volunteers in Gaza.
The system, known by the codename "Lavender," is said to be a cutting-edge artificial intelligence-based platform for guiding kinetic strikes.
Despite its complexity and integration with other advanced technologies, "Lavender" is reportedly prone to errors, raising serious concerns under international law. A fundamental rule of armed conflict is to avoid civilian casualties, yet the reports suggest that Israeli targeting practices may contradict this principle.
According to the magazine, the technology has led to civilian deaths, allegedly directing strikes at the homes of Hamas militants even while they were with their families or resting.
Israeli operations against Hamas
The magazine's sources claim that at the onset of Israel's counterattacks in Gaza, the Israeli army deemed 15 to 20 civilian casualties acceptable for a successful strike on even a low-ranking Hamas member. For higher-ranking targets, the acceptable figure reportedly rose to as many as 100 civilians.
The article goes further, suggesting that at one point 37,000 Palestinians had been designated as "targets" by the Israelis, with artificial intelligence tasked with marking many of them, especially lower-ranking members, for elimination.
This context brings additional scrutiny to the Israeli strike that killed Polish volunteer Damian Soból, a native of Przemyśl. The World Central Kitchen convoy, which was delivering food aid that had recently arrived from Cyprus, was struck repeatedly in the Gaza Strip.
Seven people were killed in the attack, which took place during an effort to deliver urgently needed food supplies.
Legal implications?
The details shared by the magazine might suggest a deliberate leak intended to expose the "Lavender" system and similar technologies. Commander Wiesław Goździewicz, an expert in international humanitarian law, nonetheless believes the leak was genuine: he argues that pinning such an attack on a machine would only harm Israel, amounting to a self-inflicted wound.
"This might indicate Israel's violation of the fundamental principles of IHL, particularly the distinction principle," the lawyer notes.
Among the five core principles of IHL is the obligation to distinguish between combatants and civilians, and between military and civilian objectives, directing attacks solely at the former.
The report also implies a possible breach of the principle of proportionality, which prohibits attacks expected to cause collateral damage disproportionate to the anticipated military advantage.
Crucially, as Goździewicz has highlighted in a study, the ethical conundrum is this: if a fully autonomous AI deployed in a drone commits a potential war crime, who bears responsibility? His answer: the creators, the implementers, and the commanding officers. "AI, after all, is a human creation; thus, humans are accountable for its actions."
"The guilt of an act committed with a firearm lies with the person, not the weapon," the officer elucidates.
Thus, those behind systems that lead to civilian casualties could face war crime charges, emphasizing the need for mechanisms ensuring accountability for the use of such technology.
Similarly, the recent attack on a humanitarian convoy, if proven to be an unwarranted use of force resulting in civilian deaths, should lead to legal consequences for those responsible.