
Autonomous weapons: States must address major humanitarian, ethical challenges

02-09-2013 FAQ

Technological advances in weaponry mean lethal decisions on the battlefield could in the future be taken by machines operating without human intervention. Here, we examine the potential implications of such a profound change in the way war is waged, and caution against the use of such weapons unless respect for international humanitarian law can be guaranteed.

How could autonomous weapons distinguish between a combatant and a civilian? Would their sophisticated technology make them better than soldiers at identifying military targets? And who would be responsible if they violate international humanitarian law? Owing to the many unresolved questions surrounding the use of autonomous weapons, the ICRC is calling on States to assess the potential human cost and international humanitarian law implications of these new technologies of warfare.

What are autonomous weapons?

Autonomous weapons (also known as lethal autonomous robots or “killer robots”) would search for, identify and attack targets, including human beings, using lethal force without any human operator intervening. Unlike the highly automated defensive systems currently in use, designed for example to shoot down incoming missiles or artillery shells, autonomous weapons would operate outside tightly constrained spatial and temporal limits. They could learn to adapt their functioning in response to changing circumstances in the environment in which they are deployed.

Weapons of this kind have not yet been deployed on the battlefield. Nevertheless, this area attracts considerable interest and research funding, so such weapons may well be a feature of warfare in the future.

There have been calls for a moratorium or a ban on the development, production and use of autonomous weapons. Does the ICRC support such calls?

The ICRC is not joining these calls for now. However, the ICRC is urging States to consider the fundamental legal, ethical and societal issues related to the use of autonomous weapons before they are developed or deployed in armed conflict, as required by international humanitarian law. The ICRC is concerned about the potential human cost of autonomous weapon systems and about whether they are capable of being used in accordance with international humanitarian law.

What does international humanitarian law say about autonomous weapons?

There is no specific rule for autonomous weapons. However, the law says that States must determine whether the use of any new weapon, means or method of warfare that they develop or acquire would be prohibited by international law in some or all circumstances. In other words, the longstanding rules of international humanitarian law, in particular the rules of distinction, proportionality and precautions in attack, apply to all new weapons and technological developments in warfare, including autonomous weapons.

The central challenge for States is to ensure that autonomous weapons are capable of complying with all these rules. For example, it is currently unclear how such weapons could discriminate between a civilian and a combatant, as required by the rule of distinction. Indeed, such a weapon would have to be able to distinguish not only between combatants and civilians, but also, for instance, between active combatants and those hors de combat, and between civilians taking a direct part in hostilities and armed civilians such as law enforcement personnel or hunters, who remain protected against direct attack. An autonomous weapon would also have to comply with the rule of proportionality, which requires that the incidental civilian casualties expected from an attack on a military target not be excessive when weighed against the anticipated concrete and direct military advantage. Finally, an autonomous weapon would have to be capable of applying the required precautions in attack designed to minimize civilian casualties.

Is a drone a type of autonomous weapon?

Autonomous weapons would operate without human supervision, in contrast to the remotely piloted aircraft (also known as drones) in use today, which require human operators to select targets and activate, direct and fire the weapons carried.

What might be the potential benefits and risks of using autonomous weapons?  

Proponents of autonomous weapons argue that the sophisticated sensors and artificial intelligence employed by such systems mean that they would be more likely than a human soldier to correctly identify military objectives and to avoid unintended civilian casualties. They also argue that autonomous weapon systems would not be influenced by negative human emotions such as fear, anger and a desire for revenge. On the other hand, an autonomous weapon would lack positive human emotions such as compassion, as well as the human judgement and experience required to correctly assess a genuine attempt to surrender, or to evaluate the concrete and direct military advantage anticipated from a given attack. Furthermore, the deployment of such weapons would reflect a paradigm shift and a major qualitative change in the conduct of hostilities. Ultimately, the question is whether the dictates of public conscience would allow machines to make life-and-death decisions and to apply lethal force without human control.

Who is responsible if the use of an autonomous weapon results in a violation of international humanitarian law?

As a machine, an autonomous weapon could not itself be held responsible for a violation of international humanitarian law. This then raises the question of who would be legally responsible if the use of an autonomous weapon results in a war crime: the programmer, the manufacturer or the commander who deploys the weapon? If responsibility cannot be determined as required by international humanitarian law, is it legal or ethical to deploy such systems?

Because so many questions remain unanswered, the ICRC is calling on States to ensure that autonomous weapons are not employed if compliance with international humanitarian law cannot be guaranteed.