6. UN Convention on CCW and all states shall prohibit developing or deploying lethal autonomous weapons.

Rapporteurs: Erin Hunt and Gerardo Lebron Laboy, Mines Action Canada

I. THE PROBLEM

Lethal Autonomous Weapons (LAWs) are prospective weapons that would select and engage (kill) their targets based on their programming. They would be “autonomous” in the sense that they would not require human intervention to actuate (act or operate according to their programming). Driven solely by algorithms, LAWs would be able to kill without any human interference or oversight.

The following arguments have been offered in support of the development of LAWs:

LAWs technology could offer better military performance and thus enhance mission effectiveness
  • LAWs, being products of robotics, could be faster and stronger than human soldiers and, not being subject to fatigue, could have greater endurance in every respect.
  • Better environmental awareness: robotic sensors could provide better battlefield observation.
  • Higher precision at longer range: advanced sensor technology could give LAWs better target precision and a longer engagement range.
  • Better responsiveness: LAWs would not suffer the lapses in situational awareness that human participants in military operations experience because of communication problems or obstructed sight and vision (the fog of war). Through an interconnected system of multiple sensors and intelligence sources, LAWs could take in and update more information, faster, than humans can, enabling better awareness of their surroundings.
  • Emotionless advantage: LAWs would not have emotions that cloud their judgement.
  • Self-sacrificing nature: LAWs would have no self-preservation instinct and thus could be used in self-sacrificing ways when needed and appropriate.
Ethical superiority
  • Because LAWs could be programmed to follow the Laws of Armed Conflict, and given their robotic nature, they would not be subject to human failings; this would permit them to comply more rigorously with International Humanitarian Law (IHL), following with high precision the principles of distinction, proportionality, and military necessity.
Casualty reduction
  • LAWs would substitute for human soldiers and as a consequence reduce own-force casualties.
  • LAWs’ better target precision could reduce collateral damage, such as civilian casualties and damage to civilian property.

The following arguments have been offered against the development of LAWs:

Immorality
  • Delegating the decision to kill to machines crosses a fundamental moral line.
Martens Clause violation
  • LAWs could not fulfill the principles of humanity and would therefore be contrary to the dictates of public conscience, violating the Martens Clause as stated in Additional Protocol I of 1977 to the Geneva Conventions.
Laws of Armed Conflict violation
  • The complexity of the interrelation among the principles of distinction, proportionality, and military necessity, and the value judgements they require, makes the Laws of Armed Conflict unprogrammable. Thus, LAWs would not be able to comply with IHL.
Unpredictability
  • Because LAWs could be designed with machine-learning algorithms, their behaviour would be unpredictable, and commanders would thus lose control over outcomes.

