
AI's just not that into you -- yet

Artificial-intelligence engineers are tackling new challenges, like teaching robots about emotions. But we're still a long way from a weepy iPhone.

Ben Fox Rubin

For all their brilliance, our phones still have as much emotional intelligence as glue. Yet, as electronics become ever more important in our lives, it may make sense to start teaching them to be more aware of our feelings.

Early glimpses of such efforts were on display this week at a gathering of more than 700 artificial-intelligence software developers, academics and researchers in Manhattan, where several talks focused on ways to make our robots, voice assistants and chatbots more, well, emotional.

"People are building these very intimate relationships with these companions, but right now these companions have no empathy," Rana el Kaliouby, CEO of emotional-recognition tech firm Affectiva, said onstage Tuesday at the inaugural O'Reilly Artificial Intelligence Conference.

Teaching robots about emotion illustrates both the promise of AI tools and the huge challenges in developing them. Artificial intelligence, which lets machines mimic human learning and problem solving, is already used to improve Google searches and scan Facebook photos for faces. Major tech companies are racing to develop more AI software to power self-driving cars and add intelligence to their devices, everything from phones to refrigerators.

The new O'Reilly conference -- which brought out speakers from Google, Intel, Facebook and Microsoft -- and the creation this week of an AI research nonprofit backed by a who's-who of tech heavyweights are just two signs of AI's growing prominence, with more AI programs making their way out of academia and into the real world.

But plenty of work still needs to happen to make this tech more useful. Teaching a machine using AI techniques is time-consuming, and even after that training, the machines can be tricked by minor curveballs.

"We're still somewhat in the beginning, but you can already see that people are excited to take this into their businesses," Ben Lorica, O'Reilly Media's chief data scientist, said in an interview at the conference. "There's definitely going to be a flowering of these intelligent applications."

A new kind of companion

There are potential benefits from teaching phones and robot helpers how to read our voices and faces for emotion. Voice assistants like Siri or Alexa might be able to better understand our needs if they could pick up on our sarcasm, happiness or anger. In one example, el Kaliouby suggested your phone might even be able to gauge your mental well-being by reading your facial expressions.

And home-assistant bots could become more persuasive when trying to get you to take your daily medicine or exercise, she said. Advertisers and film studios may also use these tools to test ads and movie trailers.

Researchers highlighted a handful of early examples of emotional robots. Pepper, a big-eyed, humanoid robot built by Japan's SoftBank Robotics, was trained to identify joy, sadness, anger and surprise. Xiaoice, a chatbot created by Microsoft and available in China and Japan, offers "emotional support" by trading messages with its more than 40 million users.

Smaller projects include a furry, social robot from MIT, called Tega, which aims to help students learn. A smart speaker from the Hong Kong University of Science and Technology, called MoodBox, changes its lighting and suggests different kinds of music to play depending on how you feel.

Giving robots emotional capabilities raises obvious concerns about user privacy, as well as the prospect of intrusive or creepy robots.

We don't have to panic quite yet (though, yes, your phone is looking right at you). As with many other advanced AI tools, there are plenty of challenges to making emotion-sensing robots a reality. For instance, when building a library of facial expressions to teach robots, it's hard to categorize what anger or fear even looks like; sometimes an expression reads as both at once.

Also, some emotions are expressed through both vocal inflections and facial expressions, so machines may need to learn to track both, and in real time. We humans, after all, are fairly complicated when it comes to emotions.

So those hurdles mean we still have a long way to go before your smart fridge can cry along with you while you watch "The Notebook" together.