Killer robots. They're closer than you think, and need to be powered down before they alter warfare forever.
So says the Campaign to Stop Killer Robots, which recently kicked off its appeal to ban autonomous military machines. Click play on the video above to hear the campaign's reasoning, and make up your own mind whether automated drones and murderous 'bots represent the inevitable future of warfare, or an odious threat to our humanity.
We speak to the marvellous Professor Noel Sharkey, perhaps best known as a judge on the BBC's Robot Wars, who now helps front the campaign. You'll also hear from Jody Williams of the Nobel Women's Initiative, who believes robots that kill without input from humans cross a moral line.
The campaign argues that machines like autonomous aerial drones or self-directed tanks put civilians at increasing risk, as they're devoid of the humanity and gut judgement that could give a human soldier pause for thought.
It may not just be moral quandaries that self-controlled military robots create, however, but legal ones too: with no human involved in managing or controlling these lethal devices, who exactly is responsible for their actions?
I asked Sharkey whether curbing military advances -- warfare traditionally being an area that breeds speedy tech development -- could hold back technology on a broader scale. "The only thing we're trying to stop is the kill function," he told me.
Do you think killer robots should be regulated, and soon? Or is a future filled with self-managing robots inevitable? Should humans remain in the loop when it comes to high-tech warfare, or would it be better if machines -- rather than people -- took charge? Let me know in the comments, or on our Facebook wall.