
UN should ban 'killer robots' before it's too late, report urges

Human rights watchdogs want the UN to ban fully autonomous, weaponized drones before they even exist.

Charlie Osborne, Contributing Writer
Charlie Osborne is a cybersecurity journalist and photographer who writes for ZDNet and CNET from London. PGP Key: AF40821B.

The X-47B prototype has proven to the US Navy that an autonomous drone can fit into operations on an aircraft carrier. US Navy photo by Mass Comm. Spec. Seaman Anthony N. Hilkowski

The United Nations is under pressure to ban fully autonomous drones before they are developed.

In a new report released by Human Rights Watch and Harvard Law School, the groups argue that so-called "killer robots" -- fully autonomous weapons able to inflict harm without human operators -- should be banned before they come into existence. The report details how, in the absence of regulation, such weapons could cause deaths for which no one is held accountable.

At the moment, drones and autonomous vehicles -- ranging from sensor-laden scouts to consumer hobby drones and self-driving cars -- are being developed at a rapid pace. Companies including Amazon are harnessing the technology for deliveries, Google is experimenting with a fully self-driving car, and startup Parrot offers a range of hobby drones to consumers.

Given how far the technology has come in only a few decades, taking these machines a step further for military use is hardly far-fetched. While regulators are exploring different avenues for governing consumer drones and unmanned aerial vehicles (UAVs), the report argues that rather than leaving lawmakers to play catch-up with technology, laws should be in place before such weapons arrive.

In the military realm, the US has for several years been flying Predator and Reaper drones in Afghanistan and elsewhere, firing missiles to destroy targets on the ground. Those unmanned aerial vehicles are under the control of a human operator, often working remotely from thousands of miles away. The US Navy, meanwhile, has been experimenting with a pair of X-47B prototype drones capable of flying autonomously, and members of the US Congress and the Defense Department remain divided over whether follow-on designs should focus on unarmed surveillance missions or be armed to carry out strikes.

As reported by The Guardian, the report says that under current laws, programmers, manufacturers and military personnel would all escape liability for deaths caused in the field by fully autonomous weaponry. The report, titled "Mind the Gap: The Lack of Accountability for Killer Robots," also suggests that no legal framework is likely to state clearly where responsibility lies in the production and deployment of such weapons -- and therefore there would be no retribution or restitution when errors occur.

"Fully autonomous weapons do not yet exist, but technology is moving in their direction, and precursors are already in use or development," the report argues. "For example, many countries use weapons defense systems -- such as the Israeli Iron Dome and the US Phalanx and C-RAM -- that are programmed to respond automatically to threats from incoming munitions. In addition, prototypes exist for planes that could autonomously fly on intercontinental missions (UK Taranis) or take off and land on an aircraft carrier (US X-47B)."

The controversial factor in autonomous weaponry is the lack of meaningful human control in selecting and engaging targets. By ceding control to a machine, there is the possibility of civilians being targeted instead of combatants, a potential arms race to develop ever more sophisticated and dangerous weaponry, and "proliferation to armed forces with little regard for the law," the report suggests.

"Existing mechanisms for legal accountability are ill-suited and inadequate to address the unlawful harms fully autonomous weapons might cause," the groups argue. "These weapons have the potential to commit criminal acts -- unlawful acts that would constitute a crime if done with intent -- for which no one could be held responsible. A fully autonomous weapon itself could not be found accountable for criminal acts that it might commit because it would lack intentionality."

Drones and automated weaponry currently used by governments are defended on the grounds that a human operator is always behind the decision to pull the trigger, so a person can be held accountable in cases of war crimes and misuse. However, researchers from Human Rights Watch and Harvard Law School believe military personnel and operators could "not be assigned direct responsibility" for the actions of a fully autonomous weapon, except in rare situations where intent to misuse such weapons can be proved. The report states:

"An alternative approach would be to hold a commander or a programmer liable for negligence if, for example, the unlawful acts brought about by robots were reasonably foreseeable, even if not intended. Such civil liability can be a useful tool for providing compensation for victims and provides a degree of deterrence and some sense of justice for those harmed. It imposes lesser penalties than criminal law, however, and thus does not achieve the same level of social condemnation associated with punishment of a crime."

The report continues:

"The lack of meaningful human control places fully autonomous weapons in an ambiguous and troubling position. On the one hand, while traditional weapons are tools in the hands of human beings, fully autonomous weapons, once deployed, would make their own determinations about the use of lethal force.

They would thus challenge long-standing notions of the role of arms in armed conflict, and for some legal analyses, they would be more akin to a human soldier than to an inanimate weapon.

On the other hand, fully autonomous weapons would fall far short of being human."

Human Rights Watch and Harvard Law School recommend that the "development, production and use" of fully autonomous weapons be prohibited through a legally binding international instrument, and that countries adopt national laws to prevent such weaponry from being developed domestically.

The report has been released ahead of a meeting of international officials at the UN in Geneva later this month, which will include a discussion on the regulation of emerging military technology.

This story originally appeared at ZDNet under the headline "UN urged to ban fully autonomous drones, weapons before they exist."