Ban autonomous weapons, urge AI experts including Hawking, Musk and Wozniak

Over 1,000 experts in robotics have signed an open letter in a bid to prevent a "global AI arms race".

Luke Westaway Senior editor

A sentry robot points its machine gun during a test in South Korea. AI researchers warn that robots should not be allowed to engage targets without human intervention. KIM DONG-JOO/AFP/Getty Images

Robotics experts from around the world have called for a ban on autonomous weapons, warning that an artificial intelligence revolution in warfare could spell disaster for humanity.

The open letter, published by the Future of Life Institute, has been signed by hundreds of AI and robotics researchers, as well as high-profile figures in the science and tech world including Stephen Hawking, Tesla CEO Elon Musk and Apple co-founder Steve Wozniak. Celebrated philosopher and cognitive scientist Daniel Dennett is among the other endorsers who've added their names to the letter.

Developments in machine intelligence and robotics are already reshaping the tech landscape -- camera-equipped drones, for instance, are prompting new debates on personal privacy, and self-driving cars have the potential to revolutionise the automotive industry. However, many experts are concerned that progress in the field of AI could yield applications for warfare that take humans out of the loop.

The open letter defines autonomous weapons as those that "select and engage targets without human intervention". It suggests that armed quadcopters that hunt and kill people are an example of the kind of AI that should be banned to prevent a "global AI arms race."

"Autonomous weapons are ideal for tasks such as assassinations, destabilising nations, subduing populations and selectively killing a particular ethnic group," the letter continues. "We therefore believe that a military AI arms race would not be beneficial for humanity."


Speaking to CNET a few weeks ago, roboticist Noel Sharkey, who has signed his name to this latest petition, warned that the killer robots of real life will be a far cry from the fantastical sci-fi depictions we see on screen.

"They will look like tanks," Sharkey said. "They will look like ships, they will look like jet fighters."

"An autonomous weapons system is a weapon that, once activated or launched, decides to select its own targets and kills them without further human intervention," explains Sharkey, who is a member of the Campaign to Stop Killer Robots -- an organisation launched in 2013 that's pushing for an international treaty to outlaw autonomous weapons. "Our aim is to prevent the kill decision being given to a machine."

The open letter cites examples of successful international agreements regarding other types of weapons, such as chemical or blinding laser weapons. "Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons -- and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits," the letter reads.

While the latest open letter is concerned specifically with lethal machines that kill without human intervention, several big names in the tech world have offered words of caution on the subject of machine intelligence in recent times. Earlier this year Microsoft's Bill Gates said he was "concerned about super intelligence," while last May physicist Stephen Hawking questioned whether artificial intelligence could be controlled in the long term. Several weeks ago a video surfaced of a drone that appeared to have been equipped to carry and fire a handgun.