If I had a dime for every time someone writes "I, for one, welcome our new robot overlords," I'd have enough money to bribe my future robot master into sparing me from the meatsack ghettos.
Our dystopian robot future is always good grist for lame jokes. Unless it might actually happen.
Human Rights Watch seems very serious about a new campaign it has launched against what it calls "killer robots."
"Urgent action is needed to preemptively ban lethal robot weapons that would be able to select and attack targets without any human intervention," the international non-governmental organization said in a release promoting its Campaign to Stop Killer Robots.
"Lethal armed robots that could target and kill without any human intervention should never be built," said Steve Goose, Arms Division director at Human Rights Watch.
"A human should always be 'in-the-loop' when decisions are made on the battlefield. Killer robots would cross moral and legal boundaries, and should be rejected as repugnant to the public conscience."
Human Rights Watch is calling for an international treaty banning the development, production, and use of fully autonomous weapons, as well as national laws against them.
It says killer robots would lack "human judgment and the ability to understand context," and create an "accountability gap" around robot actions: who would be responsible for use of lethal force, the programmer, manufacturer, or military commander?
It also fears a "robot arms race" could prompt nations to go to war more readily, since military casualties would be reduced. In other words, if robots are doing the fighting, it's that much easier to start a fight.
The campaign comes amid increased humanitarian and privacy concerns about robots used by the military and law enforcement, especially drones, which can operate on autopilot and by remote control. According to the New America Foundation, U.S. drone strikes killed 2,439 to 3,982 people in Pakistan and Yemen between 2004 and 2013.
A 2012 U.S. Department of Defense directive (PDF) requires humans to be "in the loop" when a robot uses lethal force, and says that "autonomous weapon systems may be used to apply non-lethal, non-kinetic force, such as some forms of electronic attack, against materiel targets in accordance with DoD Directive 3000.3."
Drone systems abound
Human Rights Watch, however, notes that high-level officials could waive the policy. "In effect, it constitutes the world's first moratorium on lethal fully autonomous weapons," the NGO adds. "While a positive step, the directive is not a comprehensive or permanent solution to the potential problems posed by fully autonomous systems."
As more than 70 countries already have drone systems, enforcing any ban would take enormous political will. Meanwhile, the technology is evolving rapidly.
"We're moving into more and more autonomous systems. That's an evolutionary arc," military robotics analyst Peter Singer, author of "Wired for War," was quoted as saying by AFP.
"So the role moves from being sort of the operator from afar, to more like the supervisor or manager, and a manager giving more and more of a leash, more and more independence."