Magazine article Reason

Killer Robots: Protectors of Human Rights? Why a Ban on the Development of Lethal Autonomous Weapons Is Premature

Article excerpt

"States should adopt an international, legally binding instrument that prohibits the development, production, and use of fully autonomous weapons," declared Human Rights Watch (HRW) and the International Human Rights Clinic (IHRC) in an April statement. The two groups issued a report titled "Killer Robots and the Concept of Meaningful Human Control," as experts in weapons and international human rights were meeting in Geneva to consider what should be done about lethal autonomous weapon systems (LAWS). It was the third such meeting, conducted under the auspices of the Convention on Conventional Weapons.

What is a lethal autonomous weapons system? It depends on whom you ask, but the U.S. definition provides a good starting point: "A weapon system that, once activated, can select and engage targets without further intervention by a human operator." Experts typically distinguish among technologies where there is a "human in the loop" (semi-autonomous systems, in which a person controls the technology as it operates), a "human on the loop" (human-supervised autonomous systems, in which a person can intervene and alter or terminate operations), and a "human out of the loop" (fully autonomous systems that operate independently).

The authors of that April statement want to ban fully autonomous systems, because they believe a requirement to maintain human control over the use of weapons is needed to "protect the dignity of human life, facilitate compliance with international humanitarian and human rights law, and promote accountability for unlawful acts."

HRW and IHRC argue that killer robots would necessarily "deprive people of their inherent dignity." The core argument here is that inanimate machines cannot understand the value of individual life and the significance of its loss, whereas soldiers can weigh "ethical and unquantifiable factors" when making such decisions. In addition, the groups believe that LAWS could not comply with the requirements of international human rights law, specifically the obligations to use force proportionally and to distinguish civilians from combatants. They further claim that killer robots, unlike soldiers and their commanders, could not be held accountable and punished for illegal acts.

Yet killer robots may well protect human rights during combat better than soldiers using conventional weapons do now, argues Temple University law professor Duncan Hollis in a January 2016 article in the Temple International and Comparative Law Journal.

Hollis notes that under international human rights law, states must conduct a legal review to ensure that any armaments, including autonomous lethal weapons, are not strictly speaking unlawful--that is, that they neither strike indiscriminately nor employ disproportionate force. To be lawful, a weapon must be capable of distinguishing between civilians and combatants. It also must not, by its very nature, cause unnecessary suffering or superfluous injury. A weapon would likewise be unlawful if its deleterious effects cannot be controlled.

Considerations like these have persuaded most governments to sign treaties outlawing the use of such indiscriminate, needlessly cruel, and uncontrolled weapons as antipersonnel land mines and chemical and biological agents. If killer robots could better discriminate between combatants and civilians and reduce the suffering experienced by people caught up in battle, then they would not be per se illegal.

Could killer robots meet these international human rights standards? Ronald Arkin, a roboticist at the Georgia Institute of Technology, thinks they could. In fact, Arkin argues in the journal Communications of the Association for Computing Machinery, LAWS could have significant ethical advantages over human combatants. …
