
Stephen Hawking: Humans evolve slowly, AI could stomp us out

In the latest of his pessimistic thoughts on the future, the famed physicist warns yet again of the end of the human race.

Chris Matyszczyk

Stephen Hawking is not optimistic about the human race. BBC screenshot by Chris Matyszczyk/CNET

We think of ourselves as evolved creatures. It's just that sometimes we forget how slow that evolution is.

Along comes Stephen Hawking to remind us that artificial intelligence might just evolve a little quicker than we do. The result could be the end of our evolution and, indeed, the end of us.

In a BBC interview published Tuesday, Hawking paints a picture of humanity not dissimilar to a splattered Jackson Pollock.

Hawking said he fears that a complete artificial intelligence would simply do away with us.

AI "would take off on its own, and redesign itself at an ever increasing rate," he mused. The result would quite simply be that this new, exalted intelligence would see no need for our cumbersome, turgid ways. Or, as he put it: "Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded."

This isn't the first time in recent months that Hawking has predicted our doom. In May, he warned that the moral goodness of AI depends on who controls it. In June, he cautioned that robots might simply turn out to be smarter than us.

In the latest warning, however, Hawking was asked about the new artificial intelligence that helps him speak. Developed by Intel, it learns how he thinks and begins to offer words that he might wish to use. Somehow, though, Hawking still couldn't offer a more positive view of AI's future (or ours).

It's not so easy to find much optimism in some of Hawking's more recent thoughts. Yes, he joined Facebook and joked about being an alien. But he's also warned that aliens might destroy us before our human-built robots will, simply because they'll take a look at us and rather dislike us.

At heart, it seems that for all the progress made with, for instance, the God Particle, Hawking worries that we're likely to be terminated. Indeed, in September he offered the gnarly thought that the God Particle itself might become unstable and cause a "catastrophic vacuum decay."

It's tempting to be like Googlers, who seem to believe that any amount of engineering development must be a good thing.

But with Hawking giving such consistently dire warnings, it may be wise to contemplate how we might control the uncontrollable when some bright minds believe, for example, that cars should drive us, rather than the other way around.

On a more personal note, Hawking told the BBC that despite technological advances, he wants to carry on speaking in the somewhat robotic manner of his previous technology.

"It has become my trademark, and I wouldn't change it for a more natural voice with a British accent," he said.

However playful Hawking can be, though, one is left with his essentially pessimistic view of the future. He isn't alone, either. Elon Musk, a founder of Tesla Motors and SpaceX, has likewise envisioned AI doom. And how many sci-fi movies have painted a beautiful future world? And how many have offered something a little more frightening?