Human Rights Watch has released a report calling for a ban on robots designed to kill humans.
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Thus state the Three Laws of Robotics, penned by science-fiction author Isaac Asimov in 1942. At the time, they seemed like pure make-believe, outlandish and remote.
Now they sound completely reasonable, especially given the growing use of semi-autonomous drones in the Middle East by the British and US militaries. So much so, in fact, that non-government organisation Human Rights Watch has declared killer robots a threat to civilian lives and called for a pre-emptive global ban on fully autonomous weapons.
In a 50-page report titled "Losing Humanity: The Case Against Killer Robots" (PDF), the organisation details the concerns associated with deploying killer robots — perhaps most worryingly, the lack of accountability. A robot has no way to tell civilians from military targets, and no moral compass, so how can it be punished for civilian deaths?
Fully autonomous weapons do not yet exist, but their development and deployment, Human Rights Watch believes, is only a matter of time. Semi-autonomous drones are already in action, with more efficient machines being researched and developed by militaries around the world.
Steve Goose, Arms Division director at Human Rights Watch, said, "It is essential to stop the development of killer robots before they show up in national arsenals. As countries become more invested in this technology, it will become harder to persuade them to give it up."
Maybe we should just create robots to duke it out in space if we must go to war, leaving innocent civilians to live out their lives peacefully down here.