The ongoing conflict in Gaza has underscored the urgent need for comprehensive international regulations concerning the use of Artificial Intelligence (AI) in military applications. One concerning manifestation of this issue is the emergence of Artificial Genocidal Intelligence (AGI), wherein AI systems are deliberately programmed to facilitate acts of genocide.
The recent military operations conducted by Israel in Gaza exemplify the chilling implications of AGI and mass surveillance, including the deployment of Lethal Autonomous Weapon Systems (LAWS) and semi-LAWS. Reports reveal that the Israeli military employs sophisticated AI tools, such as Lavender and Gospel, for surveillance and targeting operations, resulting in civilian casualties and widespread destruction. In light of these developments, the international community must act urgently to strengthen regulatory frameworks.
This emerging threat of AGI tests the boundaries of AI ethics, international law, and human rights. The Israeli military uses both LAWS and semi-LAWS: LAWS are a class of weapon systems that can autonomously identify and strike targets, while semi-LAWS engage targets that have been selected by a human operator.
Lavender, an AI system developed by the Israeli military, plays a crucial role in the extensive bombing of Palestinians. It analyzes surveillance data on Gaza’s population of 2.3 million, assigning each individual a score from 1 to 100 based on perceived connections to Hamas. Another AI tool, Gospel, identifies structures allegedly used by militants and has dramatically increased the rate of target identification, from around 50 targets annually to nearly 100 daily. However, airstrikes have often hit residential buildings with no militant presence, pushing the civilian death toll past 41,000 and raising concerns about the accuracy of Gospel’s algorithm and its compliance with International Humanitarian Law (IHL).
Alchemist is a system that monitors Gaza’s border and provides real-time alerts to military personnel on potential threats. Where’s Daddy?, by contrast, is specifically designed to track individuals on a military kill list and execute strikes. By linking mobile phone data to suspected militants’ identities, it has facilitated strikes against individuals in their homes, raising serious ethical questions about civilian safety and legal standards. Red Wolf, a facial recognition system deployed at checkpoints in the West Bank, controls Palestinian movement by scanning faces to verify crossing permissions; it has faced criticism for enforcing discriminatory movement restrictions.
On the international law front, the United Nations (UN) Secretary-General has condemned LAWS as "politically unacceptable and morally repugnant." The Convention on Certain Conventional Weapons (CCW) provides a basis for regulating indiscriminate weapons, and LAWS have been a central focus of its work since 2014. In 2016, a Group of Governmental Experts (GGE) was established to assess the legal, ethical, and humanitarian implications of LAWS and to draft guidelines for their responsible use. The international community is urged to establish a legal framework prohibiting the development and use of LAWS, prioritizing human oversight, accountability, and civilian protection.
Notably, the IHL principles of distinction, proportionality, and military necessity raise serious concerns over AI deployment in warfare. The principle of distinction mandates that combatants and civilians be clearly differentiated; however, Lavender and Gospel cannot reliably distinguish civilians from legitimate targets, creating a risk of wrongful targeting. Where’s Daddy? and Red Wolf, while not directly targeting civilians, impose severe restrictions that may lead to indiscriminate harm or rights infringements, likewise violating the principle of distinction. Proportionality, which prohibits civilian harm excessive in relation to the anticipated military advantage, is also at risk: Lavender’s reported 10% error rate means it may wrongfully target civilians, producing disproportionately high collateral damage, and Gospel has similarly caused extensive civilian casualties. Finally, the principle of military necessity permits only those actions required to achieve a legitimate military objective. Where’s Daddy? and Red Wolf are justified on grounds of military necessity, yet the surveillance and movement restrictions they impose often lack a direct military objective.
To regulate such AGI, the UN should strengthen the existing CCW framework rather than create new regulations. The CCW, which already oversees emerging military technologies, can incorporate AGI into its discussions through its established protocols, such as Protocol IV on blinding laser weapons. A key focus should be human oversight and accountability.
Additionally, the UN Security Council (UNSC) should recognize the use of AGI as a breach of IHL, given the civilian harm and regional instability it causes. It should draft a resolution demanding an immediate end to Israel’s use of AGI, with calls for transparency and cooperation with international investigations. The UNSC could also raise global awareness of the illegality of such actions under IHL. As a CCW member, Pakistan should actively participate in the AGI regulation process by attending GGE meetings and contributing to the development of international norms on AGI.
Palestine could also engage UN human rights bodies through treaty mechanisms to address the use of AGI. While treaty bodies lack enforcement powers, their recommendations influence domestic laws and policies. Additionally, complaint mechanisms under treaties such as the International Covenant on Civil and Political Rights (ICCPR) allow states to raise instances of non-compliance. The Special Committee on Israeli Practices also monitors and reports on human rights violations linked to such technologies.