Towards limits on autonomy in weapon systems

09 April 2018


Convention on Certain Conventional Weapons (CCW) Group of Governmental Experts on Lethal Autonomous Weapons Systems, statement of the ICRC

The International Committee of the Red Cross (ICRC) is pleased to contribute its views to this second meeting of the Group of Governmental Experts on “Lethal Autonomous Weapon Systems”.  The ICRC wishes to acknowledge and thank the Chair for leading these discussions and wishes him success in his efforts this week.

Building on the valuable work done to date, this meeting provides an opportunity to consider in more detail the complex challenges posed by autonomous weapon systems.  We are hopeful that the meeting will advance progress towards a common understanding of the characteristics of these weapons.  Setting a broad scope of the discussion will be important in this regard.

In the ICRC’s view, the purpose of identifying the characteristics of the systems under consideration should be to identify the features that distinguish autonomous weapon systems from those controlled directly by humans, including remote-controlled weapons.  The objective at the outset need not be to define systems of concern.

From the ICRC’s perspective, the focus must be on the functions of weapon systems that are most relevant to legal obligations and ethical concerns within the scope of the Convention on Certain Conventional Weapons, namely autonomy in the critical functions of selecting and attacking targets.  Autonomy in other functions (such as movement or navigation) would not in our view be relevant to the discussions.

Experiences with existing weapon systems with autonomy in their critical functions should be harnessed for a greater understanding of the legal and ethical issues raised by autonomy in weapon systems more generally, and of the nature of human control over the use of force that must be retained for legal compliance and ethical acceptability.  For the ICRC, it is crucial that discussions draw on real-world technologies and near-term developments of autonomy in weapon systems.  To do otherwise would risk overlooking incremental developments in autonomy that raise concerns and that may require legal and policy responses from the international community.

The degree of human control over weapon systems – not their technological sophistication – should be the yardstick in these discussions. It is humans – not machines, computer programs or weapon systems – who apply the law and are obliged to respect it.

International humanitarian law (IHL) requires that those who plan, decide upon and carry out attacks make certain judgements in applying its norms.  Ethical considerations parallel this requirement – demanding that human agency and intention be retained in decisions to use force.  Humans therefore bear responsibilities in the programming, development, activation and operational phases of autonomous weapon systems.

Mindful of the potential adverse humanitarian consequences of the loss of human control over weapons and the use of force, the ICRC has posited that a minimum level of human control is necessary from both a legal and an ethical perspective.  In the view of the ICRC, a weapon system beyond human control would be unlawful by its very nature.  But beyond these so-called “fully autonomous weapon systems”, there is a need to consider the full range of risks associated with weapon systems that have autonomy in their critical functions.

The ethical concerns around loss of human agency in decisions to use force, diffusion of moral responsibility and loss of human dignity could have far-reaching consequences, perhaps precluding the development and use of anti-personnel autonomous weapon systems, and even limiting the application of anti-matériel systems, depending on the risks that destroying matériel targets presents for human life.

The ICRC continues to urge all States in this meeting to elaborate what “meaningful” or “effective” human control entails in practice. States must also address fundamental concerns about weapon systems that may introduce inherent unpredictability, such as those employing artificial intelligence (AI) machine-learning algorithms.

The affirmations by CCW States Parties in this Group that IHL is both relevant and applicable to any emerging weapon technology, including autonomous weapon systems, are heartening.  The ICRC welcomes efforts to improve implementation of IHL, including through enhancing mechanisms to review the legality of new weapons.  Conducting legal reviews of weapon systems with autonomy in their critical functions can raise challenges, in particular regarding standards of predictability and reliability.  The ICRC encourages the exchange of information and experiences about these processes, and views such efforts as complementary to the work of the Group of Governmental Experts.

This being said, the ICRC remains convinced that the overall purpose of this Group should be to agree limits on autonomy in weapon systems.  As noted by the ICRC in previous meetings on this topic, technological developments that remove or reduce direct human control over weapon systems are threatening to outpace international deliberations, and States must therefore approach this task with some urgency.  A “human‑centred” approach must guide the identification of limits to autonomy in weapon systems and of possible options to address “autonomous weapon systems of concern”.