Autonomous weapons: States must agree on what human control means in practice

20 November 2018
Geneva (ICRC) – Should a weapon system be able to make its own “decision” about who to kill?

The International Committee of the Red Cross (ICRC) believes that the answer is no, and today is calling on States to agree to strong, practical and future-proof limits on autonomy in weapon systems.

During the annual meeting of States parties to the Convention on Certain Conventional Weapons in Geneva from 21 to 23 November, the ICRC will urge that the new mandate of the Group of Governmental Experts focus on determining the type and degree of human control that would be necessary to comply with international humanitarian law and satisfy ethical concerns. Several questions need to be answered:

  • What is the level of human supervision, including the ability to intervene and deactivate, that would be required during the operation of a weapon that can autonomously select and attack targets?
  • What is the level of predictability and reliability that would be required, also taking into account the weapon’s tasks and the environment of use?
  • What other operational constraints would be required, notably on the weapon system’s tasks, its targets, the environment in which it operates (e.g. populated or unpopulated area), the duration of its operation, and the scope of its movement?

“It is now widely accepted that human control must be maintained over weapon systems and the use of force, which means we need limits on autonomy,” said ICRC President Peter Maurer. “Now is the moment for States to determine the level of human control that is needed to satisfy ethical and legal considerations.”

Only humans can make the context-specific judgements of distinction, proportionality and precaution required in combat. Only humans can behave ethically, uphold moral responsibility and show mercy and compassion. Machines cannot exercise the complex and uniquely human judgements required on battlefields to comply with international humanitarian law. As inanimate objects, they will never be capable of embodying human conscience or ethical values.

Given militaries’ significant interest in increasingly autonomous weapons, there is a growing risk that humans will become so far removed from the choice to use force that life-and-death decision-making will effectively be left to sensors and software.

“Humans cannot delegate the decision to use force and violence to machines. Decisions to kill, injure and destroy must remain with humans. It is humans who apply the law and are obliged to respect it,” said Kathleen Lawand, the head of the ICRC’s arms unit.

Note to editors/reporters: Kathleen Lawand and Neil Davison, ICRC experts in arms issues, can provide more information on autonomous weapons.


For further information, please contact:


Marie-Servane Desjonquères, ICRC, tel: +962 7 7843 7401