Autonomous Weapon Systems and International Humanitarian Law: Selected Issues

Position paper by the International Committee of the Red Cross

To support current international efforts to regulate autonomous weapon systems (AWS), the ICRC has prepared this position paper to clarify how international humanitarian law (IHL) applies to AWS, highlight the specific humanitarian and legal challenges they raise, and contribute to discussions on new rules. Building on its mandate to promote and strengthen IHL, the ICRC publishes this paper to assist States in ensuring the protection of civilians and upholding the principle of humanity in the face of emerging technologies of warfare.

ICRC’s concerns about AWS

In the view of the ICRC, as well as that of many States and other actors, AWS are weapon systems that, once activated, can select and engage one or more targets without further human intervention. After initial activation or launch, an autonomous weapon system triggers a strike in response to information received from the environment through sensors and on the basis of a generalized “target profile”. As a result, the user does not choose, or even know, the specific target(s) or the precise timing and/or location of the resulting application of force.

The use of AWS entails serious risks due to the difficulty of anticipating and limiting their effects. The loss of human control and judgement in decisions over life and death raises profound humanitarian, legal and ethical concerns. In particular, AWS:

  • pose risks of harm to those affected by armed conflict, both civilians and combatants, as well as dangers of conflict escalation;
  • raise challenges for compliance with international law, including IHL, notably the rules on the conduct of hostilities; and
  • raise fundamental ethical concerns by delegating life and death decisions to machines, which diminishes both the moral agency of the users and the human dignity of those against whom force is used.

Regardless of the sophistication of AWS and associated sensor, software and robotics technologies, it is important to emphasize that IHL obligations regarding the conduct of hostilities must always be fulfilled by humans. It is not the weapon system that must comply with IHL, but the humans using it.

Clarifying IHL and addressing legal uncertainty


In 2019, the High Contracting Parties to the Convention on Certain Conventional Weapons (CCW) reaffirmed the applicability of IHL to AWS, a position echoed by United Nations General Assembly (UNGA) resolutions in 2023 and 2024. As with any other weapon, existing IHL already regulates and constrains the development and use of AWS.

To uphold and reinforce this framework, and to provide developers and users with the clarity needed to comply with their legal obligations, it is essential to articulate clearly the specific conditions and limits on the development and use of AWS that may be derived from existing IHL principles and rules, including the types of AWS that cannot be employed in compliance with IHL. Without such clarity, there is a risk that future practices may erode the protections currently afforded under IHL to those affected by armed conflict, and undermine the principles of humanity and the dictates of public conscience.

This position paper sets out the ICRC’s views on selected IHL issues raised by AWS. It examines how IHL rules — particularly those on means and methods of warfare and the rules on conduct of hostilities — apply to AWS, and identifies the challenges posed by the functioning and potential effects of AWS. It also explores ways to address the challenges AWS pose to IHL compliance, to strengthen legal certainty and to ensure the responsible use of AWS in armed conflict.

The need for new legally binding rules

As reflected throughout this position paper, while IHL applies to AWS, existing rules do not hold all the answers to the humanitarian, legal and ethical questions raised by AWS. States hold differing views about what limits and requirements for the design and use of AWS derive from existing rules of IHL. The ICRC is therefore convinced that new rules are urgently needed to clarify and specify how IHL applies to AWS, as well as to address wider humanitarian risks and fundamental ethical concerns. These rules should include specific prohibitions and restrictions on the development and use of AWS and should safeguard the protections afforded by IHL against being undermined by increasing autonomy in weapon systems. Any such limits would be additional and complementary to existing IHL rules, including weapons treaties, and would not displace them.