Ethical superiority
- Because LAWs could be programmed to follow the Laws of Armed Conflict, and given their robotic nature, they would not be subject to human failings. This would permit them to comply more rigorously with International Humanitarian Law (IHL), applying the principles of distinction, proportionality, and military necessity in a highly precise manner.
Casualty reduction
- LAWs would substitute for human soldiers and, as a consequence, reduce own-soldier casualties.
- LAWs’ superior targeting precision could reduce collateral damage, such as civilian casualties and damage to civilian property. As robotic systems, LAWs could be faster and stronger than human soldiers, with greater endurance in every respect, since they are not subject to fatigue.
The following arguments have been offered against the development of LAWs:
Immorality
- Delegating the decision to kill to machines crosses a fundamental moral line.
Martens Clause violation
- LAWs could not fulfill the principles of humanity and would therefore be contrary to the dictates of public conscience, thus violating the Martens Clause as stated in Additional Protocol I of 1977 to the Geneva Conventions.
Laws of Armed Conflict violation
- The complex interrelation of the principles of distinction, proportionality, and military necessity, and the value judgements they require, makes the Laws of Armed Conflict unprogrammable. Thus, LAWs would not be able to comply with IHL.
Unpredictability
- Because LAWs could be designed with machine learning algorithms, their behavior would be unpredictable, and commanders would thus lose control of outcomes.
Algorithmic bias
- LAWs’ programs could be subject to human bias introduced during the algorithmic design process, opening the possibility of unethical discrimination and inhumane treatment.
Accountability
- It is uncertain how accountability could be assigned for LAWs, given the number of humans involved in the use or production of these weapons (operators, commanders, programmers, manufacturers, etc.). Neither criminal law nor civil law guarantees adequate accountability for individuals directly or indirectly involved in the use of autonomous weapons systems.
Totalitarian possibilities
- LAWs would lack the human capacity to refuse orders that seem unethical or immoral, and thus could more easily serve totalitarian purposes in the hands of commanders.
Lack of constraints
- LAWs would not be subject to the human constraints imposed by emotions, empathy, and compassion, which serve as an important check against the killing of civilians.
Force Multiplier
- Because LAWs would distance humans from the risks and tragedies of war by enabling remotely driven tactics, they would make the political decision to go to war easier. They would thus function as a force multiplier, promoting more conflict rather than less, and lead to a paradigm shift in warfare in which remoteness plays the central role.
Arms Race
- The development of LAWs would initiate a global arms race, leading to increased international instability.
II. SOLUTION
The solution is an international preemptive ban on the development of lethal autonomous weapons, adopted by the High Contracting Parties of the UN Convention on Certain Conventional Weapons. This ban would build on other humanitarian disarmament treaties and on the preemptive ban of blinding laser weapons established by the Protocol on Blinding Laser Weapons, Protocol IV of the 1980 Convention on Certain Conventional Weapons.
The adoption of this solution depends entirely on the willingness of the parties to agree to and adopt the ban. As of today, the call for a ban on lethal autonomous weapons is supported by the following 25 states: Algeria, Argentina, Austria, Bolivia, Brazil, Chile, Costa Rica, Colombia, Cuba, Djibouti, Ecuador, Egypt, Ghana, Guatemala, Holy See, Iraq, Mexico, Nicaragua, Pakistan, Panama, Peru, State of Palestine, Uganda, Venezuela, and Zimbabwe. China has expressed support for a ban on the use of LAWs, but not on their development.
The technology industry has voiced considerable support for a ban on the development of LAWs. Over 1,000 experts in robotics and artificial intelligence have signed two letters from the Future of Life Institute supporting the ban: "Autonomous Weapons: An Open Letter from AI & Robotics Researchers" and the "Lethal Autonomous Weapons Pledge". Signatories of these letters include Stephen Hawking, Elon Musk, Steve Wozniak, Noam Chomsky, Skype co-founder Jaan Tallinn, Google DeepMind co-founder Demis Hassabis, and others.
Seventy religious leaders, representatives, and faith-based organisations have signed an interreligious declaration, an initiative of PAX in cooperation with Pax Christi International, calling on states to work towards a global ban on fully autonomous weapons.
More than 20 Nobel Peace Prize Laureates have endorsed a joint statement calling for a ban on weapons that would be able to select and attack targets without meaningful human control.
The United States and Russia have expressed the view that an international ban on lethal autonomous weapons would be premature. Instead, they encourage further analysis of the possible benefits this new technology could offer. The Foreign Office and the Ministry of Defence of the United Kingdom have expressed their opposition to an international ban, arguing that international humanitarian law already addresses the issue.
LAWs have not yet been fully developed; in fact, much of the proposed technology does not yet exist. This places the international community at an advantageous point where, as with blinding laser weapons, we can prevent a humanitarian catastrophe and its consequences altogether.
Footnotes for this article can be seen at the Footnotes 1 page on this website.