Utter the phrase "killer robot" to someone and you'll likely see them smile, a grin spreading as they picture a 10-foot chrome monster firing its lasers and bleating electronically about exterminating humans. But while sci-fi 'bots -- even the ones that turn evil -- occupy a fond place in our hearts and on our movie screens, the reality of lethal, autonomous machines is much closer, and more terrifyingly mundane, than we might think.
"They will look like tanks. They will look like battleships. They will look like jet fighters," says robotics professor Noel Sharkey, speaking to CNET about the Campaign to Stop Killer Robots. The organisation is working to secure an international treaty banning robots that kill without human prompting. In the video above, we let the Campaign explain why it believes the decision to kill must remain in human hands.
From missile defence systems to sentry robots and unmanned aircraft, automated military systems that act with very little human input are already in service or in development. The Campaign argues that closing the loop and letting a machine pull the trigger would cross a significant ethical line.
Made up of non-governmental organisations including Article 36 and Human Rights Watch, the group says that giving a military machine the responsibility of taking human life is morally and pragmatically problematic. Could a computer adequately deal with shifting combat scenarios -- for instance, if an enemy stronghold becomes occupied by civilians in the time it takes a drone to fly there, will the robot know to hold its fire? And if it does launch its missiles and kill civilians, who's responsible for their deaths?
(Gallery: The increasingly autonomous robots of war)
With artificial intelligence becoming increasingly sophisticated, we may have to make up our minds about robots in warfare sooner than we think. Watch the video above and let us know your thoughts in the comments below. For more discussion on killer robots, listen to this week's CNET UK Podcast, embedded below.