
"Risks from the unconstrained use of autonomous weapons in armed conflict are stark"

Statement given by Mirjana Spoljaric, President of the ICRC, at the Luxembourg Autonomous Weapons Systems Conference.

Check Against Delivery


Excellencies, Ladies and Gentlemen,

I am very pleased to be here at the University of Luxembourg, and I thank the Luxembourg Directorate of Defence, and Minister Bausch in particular, for organizing this conference at such a critical juncture.

We are witnessing the rapid development of autonomous weapon systems, including those controlled by artificial intelligence, together with military interest in loosening the constraints on where – or against what – those weapons will strike. These developments led the International Committee of the Red Cross to call on governments to establish new international constraints that are clear and binding.

We are not alone in calling for action. There has been heightened political mobilization along similar lines, including a joint statement by 70 states at the United Nations General Assembly last October. In February this year, 33 Latin American and Caribbean states endorsed the Belén Communiqué, and I commend Vice-Minister Guillermet-Fernández for Costa Rica's convening role. There was also a clear sense of momentum in last month's discussions at the UN Convention on Certain Conventional Weapons.

And, of course, I would like to acknowledge that Luxembourg and Costa Rica have been among those recognizing the need for specific prohibitions and regulations on autonomous weapons, as well as the clear position of the United Nations Secretary-General, and the important role of civil society organisations and the scientific community.



Although many opportunities for society arise from developments in information and robotics technologies, the risks from the unconstrained use of autonomous weapons in armed conflict are stark.

Loss of human control over the use of force in armed conflict risks jeopardizing the protections for both combatants and civilians and brings dangers of conflict escalation. It would undermine belligerents' ability to abide by international humanitarian law, or "IHL", including the obligations of those planning and deciding upon attacks to anticipate and limit their effects. It would endanger their ability to make the human judgments required to comply with the rules governing the conduct of hostilities.
Autonomous weapons also fundamentally challenge our shared values, our humanity. Should we tolerate a world in which life and death decisions are reduced to machine calculations?

It can be helpful to ground the policy debate in current and historical realities of warfare. Militaries are deploying weapons with increasingly autonomous functions. Many of today's remote-controlled weapons could become tomorrow's autonomous weapons with just a software update or a change in doctrine.

Take the example of a loitering weapon – a cross between a missile and a drone which "loiters" over an area before striking an object or person below. We understand that, for now, most are guided to a target chosen by a human operator. But some militaries and manufacturers have indicated their intention to allow such weapons to strike autonomously.

Looking back, there is a parallel to anti-personnel landmines, which were banned in 1997. They demonstrated the harm that can be caused when the weapon user – the person who places the mine – does not specifically choose who will be targeted or when the weapon will be triggered. With today's mobile autonomous weapons, the user may not even know precisely where a strike will take place.

This potential loss of human control over the use of force is why we are so concerned by the further development and use of autonomous weapons.

A call for new, binding international rules is not one we make often, or lightly. While IHL already applies to and sets constraints on the design and use of autonomous weapons, states hold differing views on what particular limits and requirements derive from its existing rules. We believe new law is needed to bring clarity in this regard, to uphold and strengthen legal protections, and to respond to ethical concerns.

What is needed now is political leadership for international action to negotiate and adopt a legally binding instrument regulating autonomous weapon systems.

Thank you again, Minister Bausch, for making this a priority for Luxembourg. I urge other European states to similarly make this a priority. To use a phrase adopted by Luxembourg: Let's make it happen.