The challenges raised by increasingly autonomous weapons
On June 24, 2014, the ICRC Vice-President, Ms Christine Beerli, opened a panel discussion on "The Challenges of Increasingly Autonomous Weapons", organized at La Maison de la Paix (Geneva) by the ICRC, the Swiss Federal Department of Foreign Affairs and the Geneva Academy of International Humanitarian Law and Human Rights.
Speech by Ms Christine Beerli, Vice-President of the International Committee of the Red Cross
Ladies and gentlemen,
It is my great pleasure to speak here to celebrate the 150th anniversary of the original Geneva Convention. The International Committee of the Red Cross is pleased to be a co-organizer of this event together with the Swiss Federal Department of Foreign Affairs and the Geneva Academy.
The issues raised by new technologies of warfare are a key part of the ICRC’s wider focus on the varied and complex challenges of contemporary armed conflicts, including the need to ensure better respect for international humanitarian law.
I will concentrate most of my remarks on weapons, in particular new weapons. International humanitarian law imposes limits through its general rules on the conduct of hostilities and through specific treaty and customary rules limiting or prohibiting the use of weapons that cause unacceptable harm.
Over the past 150 years States have agreed to limitations or prohibitions on existing or newly developed weapons due to their human cost. And during this time the ICRC has been at the forefront of alerting States to the humanitarian impact of certain weapons with the aim of protecting civilians from their indiscriminate effects and combatants from unnecessary suffering.
But before turning to the implications of recent developments, I would like to stress that today, one of the greatest dangers to civilians and civilian infrastructure comes not from new technologies but from old ones. The use of conventional rockets, mortars, bombs and missiles in populated areas has disastrous consequences. Due to the significant likelihood of indiscriminate effects, the ICRC considers that explosive weapons with a wide impact area should be avoided in densely populated areas. This position is guiding the ICRC in its ongoing dialogue with States on the impact of these weapons.
Nevertheless, we must at the same time keep abreast of new technologies in order to anticipate and prevent future humanitarian problems. Of these new weapon technologies, combat drones and autonomous weapon systems have given rise to particular debate.
The distinction drawn by the ICRC between combat drones and autonomous weapon systems is based on the human role in identifying and attacking targets. Although this could change in the future, drones currently require a human operator to choose the target and fire the weapon, whereas autonomous weapon systems select and attack targets independently of human intervention.
The deep controversy around the use of armed drones has produced two opposing camps. Proponents argue that their relative precision significantly reduces the risk of incidental civilian casualties and damage. Critics counter that drone attacks can constitute extra-judicial killing, especially outside conflict zones, and that hundreds if not thousands of civilians have died in these attacks.
The ICRC endeavours to assess the effects of drone attacks and whether they might be the result of violations of international humanitarian law, although these assessments are hampered by difficulties in collecting first-hand information.
Under IHL, there is no specific rule on, or prohibition of, drones. As with manned combat aircraft, their use is governed by the rules on the conduct of hostilities, and there is nothing specific to drones that would prevent implementation of these rules when drones are used in an armed conflict. The responsibility for complying with IHL lies with the drone operators, their commanders and the relevant party to the conflict. In contrast, given the strict limits on the use of force imposed by the law-enforcement framework, it would appear that drones could be relied on in law enforcement only in exceptional circumstances.
The debate about autonomous weapon systems has come to the fore more recently. The ICRC first raised its concerns in 2011. In March this year we held an international experts’ meeting bringing together 21 States and independent experts to explore the technical, military, legal and ethical issues. We have also been contributing to discussions held in the framework of the Convention on Certain Conventional Weapons.
From our point of view the fundamental issue is human control over the use of weapons and consequently over the use of force. Today we are witnessing the development of weapons with increasing autonomy in the “critical functions” of identifying and attacking targets. The prospect of allowing machines to make life-and-death decisions on the battlefield with little or no human involvement raises legal and ethical concerns.
From a legal perspective it is crucial to ensure that any such weapons are not employed if respect for international law cannot be guaranteed. To expand on one concern, if weapon systems become more autonomous – with greater “freedom” to determine their operations and attacks – they may become less predictable. This unpredictability raises important questions, for example: What assurance is there that a weapon system will always operate within the law? How can the weapon system be adequately tested and verified and its lawfulness assessed?
On this topic we must also look beyond international law, and beyond the technology itself, to confront a more fundamental question of humanity, one for which the Martens Clause provides a bridge with international law: is it ethically, morally and socially acceptable, and indeed desirable, to substitute machines for humans when making decisions on the use of force? Amidst the complexities of the ongoing discussions, one thing seems clear: there is a sense of deep discomfort with the idea of any weapon system that places the use of force beyond human control.
While new weapons technologies present profound challenges, the application of new technologies in general also offers real opportunities to change the way humanitarian actors work. The ICRC has started to use new technologies to advance its efforts in the following areas: assistance – for example, use of mobile cash transfers and crowdsourcing to create maps; protection – including use of satellite imagery to assess situations; and prevention – such as the use of professional video-game graphics as training tools.
In doing this we must not lose sight of our core responsibility and we must ensure that the use of new technologies always enhances our work and never diminishes it. In particular, there is no substitute for direct human contact with vulnerable populations in areas affected by violence.
The impact of new technologies in warfare is a central question both for humanitarian law and for humanitarian policy. With this in mind, the ICRC has sought to expand the discussion in recent months through a series of public debates.
Our first Research and Debate Cycle, on New Technologies and the Modern Battlespace, ran from March to June this year, with six debates worldwide on:
- the challenges of new technologies in warfare
- autonomous weaponry in armed conflict in Washington, DC
- new warfare technologies and challenges for protection in Boston
- cyber warfare in Geneva
- use of technologies to enhance the cognitive and physical performance of soldiers in Melbourne
- and finally, earlier today, an online seminar on Technological Innovation and Principled Humanitarian Action.
By way of conclusion, I would like to return specifically to the challenges raised by increasingly autonomous weapons and to emphasize that what’s needed is a holistic reflection that fully addresses the risks and implications of such technologies from multiple perspectives. To this end, the ICRC will continue to urge States to consider the potential legal, ethical and human consequences well before developing and using new weapons.