Statement

Autonomous weapons: The ICRC remains confident that states will adopt new rules

Statement by the International Committee of the Red Cross prepared for the meeting of the Group of Governmental Experts on Lethal Autonomous Weapons Systems of the Convention on Certain Conventional Weapons (CCW), 7–11 March 2022

The International Committee of the Red Cross (ICRC) welcomes the continued work of the Group of Governmental Experts (GGE) and urges the High Contracting Parties to the CCW to take their important work forward in line with one of the main purposes of this Convention, namely "the need to continue the codification and progressive development of the rules of international law applicable in armed conflict".

The expanding development and use of autonomous weapon systems bring this purpose into sharp focus and lend urgency to the need for the international community to respond effectively and in a timely manner.

At the end of last year, the ICRC urged the High Contracting Parties to agree an ambitious mandate for the GGE – one that sets out a path towards the adoption of new, international, legally binding rules.

Despite a lack of clarity in the direction of the new mandate, we trust that states will now seize the opportunity to articulate an effective response to the risks that autonomous weapons pose – one that does indeed "take into account the example of existing protocols within the CCW" and builds on past work.

The ICRC is encouraged that most High Contracting Parties, individually and jointly with others, have previously expressed their readiness to commit not to develop or use autonomous weapons that pose unacceptable risks, and to establish limits on all others.

This two-tiered approach aligns with the ICRC's recommendation to prohibit autonomous weapons that are unpredictable and those designed or used to target humans, and with our recommendation to strictly regulate the design and use of all other autonomous weapons.

The broad support for these commitments demonstrates a convergence of views among a significant and growing number of states, the ICRC, civil society, and leading figures in the scientific community: autonomous weapons pose distinct risks to people affected by war; their use in compliance with international law, including international humanitarian law (IHL), presents a challenge; and certain types of autonomous weapons would cross legal and ethical lines.

The ICRC is also encouraged that an increasing number of states view it as both necessary and feasible to articulate international limits on autonomous weapons in the form of new legally binding rules and have adopted national commitments to promote initiatives to this effect. The ICRC is confident that states will find a way to start negotiations to formalize these limits in a new legal instrument and remains convinced that such rules are needed.

New legally binding rules have clear benefits. They would clarify and specify how existing IHL applies to autonomous weapons. They would develop and strengthen existing legal protections to address wider humanitarian risks and fundamental ethical concerns. And they would offer legal certainty and stability.

The Saint Petersburg Declaration set an example 154 years ago and the international community must continue to fix "the technical limits at which the necessities of war ought to yield to the requirements of humanity". It is not for the developers of military technologies alone to decide what these limits should be. 

Without such rules, the ICRC is concerned that further developments in the design and use of autonomous weapons may give rise to practices that erode the protections currently afforded to the victims of war under IHL and the principles of humanity.

For example, would political commitments or a set of principles offer the level of precision and clarity required to ensure states do not develop autonomous weapons that others deem illegal or wholly unacceptable?

Would these commitments or principles identify the conditions under which autonomous weapons can be used in compliance with the law? Would such instruments instil confidence among the High Contracting Parties that any principles will be honoured, and reassure them that there will be recourse in the event of any concerns?

Discussions over the past eight years at the CCW meetings suggest that such non-legally binding measures alone will not offer an effective or timely response to the many serious challenges posed by autonomous weapons.

The fact that fundamental questions about the role of humans in applying the law remain unresolved – and differences of opinion persist among states – only underlines the need for new legally binding rules.

In the ICRC's view, the process of negotiating new rules on autonomous weapons will provide the opportunity to resolve fundamental legal questions and ethical concerns. Non-legally binding measures can be complementary and mutually reinforcing, most notably when operationalizing new rules. 

The ICRC sometimes hears the argument that there is not sufficient evidence of humanitarian harm to warrant the adoption of new rules on autonomous weapons. However, in our view, such arguments lose sight of the long-standing concerns about weapons that select and apply force to targets without human intervention, sometimes referred to as "victim-activated weapons". The acute risks that such weapons pose to civilians, because their effects are difficult to predict and control, are well documented.

The High Contracting Parties to this Convention have responded to these risks, including by placing increasingly stringent international, legally binding limits on the use of landmines in the 1980 CCW Protocol II and again in Amended Protocol II of 1996. In light of the particularly serious humanitarian concerns posed by anti-personnel mines, a majority of states banned these altogether in yet another international, legally binding instrument.

Together, these instruments constituted major advancements for the protection of civilians and the progressive development of IHL. And yet, with the expanding development of autonomous weapon systems, the underlying problem with this process of using force is left largely unaddressed: that of a weapon system triggering a strike itself while the user does not choose, or even know, the specific target and the precise timing and/or location of the resulting application of force. 

Against the backdrop of evolving technologies and practices of warfare, it is now time to tackle the fundamental challenges and serious risks that come with this process of applying force, including the acute ethical concern with applying force to human beings in this way. It is time to act with determination to prevent future humanitarian crises. 

The ICRC remains confident that states will find an international legal response that is commensurate with these risks. We look forward to engaging in more detailed discussions on specific proposals during the rest of this week.