If you're obsessed with "Westworld" or "Ex Machina," you might be preparing for humankind's impending doom. You know, the dawn of artificial intelligence.
Facebook has this advice: Take a deep breath.
The social network on Thursday launched a campaign aimed at demystifying AI by educating people on the basics of how it works. In a handful of short videos, Facebook explains the technology behind things like photo recognition, self-driving cars and language translation as part of an effort to assuage fears that AI will inevitably lead to robots running amok.
Artificial intelligence already goes into many of the online services you use every day. On Facebook and Instagram, AI helps rank what you see first, based on what it guesses you might find most interesting. Elsewhere in Silicon Valley, Google uses AI in everything from search to maps. Google CEO Sundar Pichai has said the search giant is moving from being "mobile-first" to "AI-first."
Facebook CEO Mark Zuckerberg estimates that a quarter of the company's engineers and more than 40 teams work on AI platforms. One big team, the Facebook AI Research group, or FAIR, has more than 75 engineers and research scientists spread around the world. Another, the Applied Machine Learning group, has about 140 members.
The money and energy poured into AI haven't diminished criticism. Physicist Stephen Hawking has said AI could "spell the end of the human race," while Tesla CEO Elon Musk and other Silicon Valley elites launched a nonprofit research company called OpenAI, aimed at advancing AI in ways that benefit humanity as a whole.
Artificial intelligence and machine learning, a technology that lets computers learn from data without being explicitly programmed, have been especially sensitive issues for Facebook since Donald Trump pulled off an upset victory in the US presidential election.
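To make "learning without being explicitly programmed" concrete, here's a toy sketch (not anything Facebook has published): instead of hard-coding the rule relating inputs to outputs, the program infers it from examples.

```python
def learn_slope(examples, steps=1000, lr=0.01):
    """Fit y = w * x to (x, y) pairs by gradient descent."""
    w = 0.0  # start with no knowledge of the rule
    for _ in range(steps):
        # Average gradient of the squared error with respect to w.
        grad = sum(2 * (w * x - y) * x for x, y in examples) / len(examples)
        w -= lr * grad  # nudge w in the direction that reduces the error
    return w

# The rule y = 2x is never written into the code; it emerges from the data.
data = [(1, 2), (2, 4), (3, 6)]
print(round(learn_slope(data), 3))  # close to 2.0
```

The same principle, scaled up to millions of parameters and examples, is what powers photo recognition and news-feed ranking.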
Filter bubble: Toil and trouble?
The social network's news feed algorithms, which are powered in part by machine learning and which decide what you see or don't see on Facebook, have come under fire in the aftermath of the election. Trump's detractors argue fake news circulated on people's news feeds played a role in his victory. Other critics contend "filter bubbles," the notion that Facebook users are exposed only to viewpoints that dovetail with their own, may have been the reason people were so blindsided.
Yann LeCun, who runs FAIR, said AI could "probably" reduce the filter bubble problem, but it's a question of how to deploy it into Facebook's services.
"We probably have the technology," LeCun said in a journalist roundtable last month, referring broadly to the company's software chops. "It's just, how do you make it work on the product side, not on the technology side?"
Facebook has declined to specifically address criticism of its role in the election, but a spokesman said the company hasn't built such tools.
Zuckerberg said at a conference shortly after the election that Facebook does show people stories they may not agree with, but that sometimes people just tune them out. "It's not that the diverse information isn't there," he said. "We haven't gotten people to engage with it in higher proportions."
And last month, he detailed a plan to fight fake news, including developing better "technical systems" to flag false stories before they gain traction.
The company has also been trying to infuse AI into more of its products than just Facebook.
Joaquin Candela, who runs the Applied Machine Learning group, said Facebook-owned Instagram used AI earlier this year when it made a big shift: ranking posts by what a user might find most interesting instead of just displaying them in reverse-chronological order. The change was easy because Instagram could simply plug in some of Facebook's existing AI technology.
The videos Facebook released Thursday try to make the science behind AI more accessible. One video explains how computers can detect certain elements of a photo regardless of where they appear in the frame. Another explains how AI can tell the difference between a photo of a car and a photo of a dog.
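The idea behind detecting an element anywhere in a frame can be illustrated with a toy example (this is a simplification, not Facebook's actual system): slide a small template across the image and check every position, the same principle behind the convolution filters used in photo recognition.

```python
def find_pattern(image, pattern):
    """Return every (row, col) where `pattern` appears in `image`."""
    ph, pw = len(pattern), len(pattern[0])
    hits = []
    for r in range(len(image) - ph + 1):
        for c in range(len(image[0]) - pw + 1):
            # Cut out a window the same size as the pattern and compare.
            window = [row[c:c + pw] for row in image[r:r + ph]]
            if window == pattern:
                hits.append((r, c))
    return hits

# A small "corner" shape is found wherever it sits in the frame.
image = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
corner = [[1, 1],
          [1, 0]]
print(find_pattern(image, corner))  # [(1, 1)]
```

Because the same check is applied at every position, the detector doesn't care whether the shape appears in a corner or in the center, which is what the video means by location-independent detection.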
"It's not magic," Candela said during the meeting last month.
LeCun chimed in. "We try to make it look that way, but it's not."