Happy or sad? Your future car might know the difference

Using deep learning, Affectiva has developed an emotion recognition engine that promises to make cars much more human.

Affectiva's demonstration app looks at a face and analyzes it for emotional expression in real time. (Image: Affectiva)

"A smile is a smile all over the world." Abdelrahman Mahmoud, product manager at Affectiva, doesn't say this as a hopeful affirmation, he knows it from empirical evidence. His company spent the last eight years building an emotion recognition engine, currently used for market research and being developed for use in cars.

Emotion-aware cars could alert distracted drivers to pay attention, soothe angry drivers, or, in a highly automated car, figure out how best to hand control back to a human driver.

Increasingly sophisticated driver assistance technology lets cars take over many driving tasks, a trend that will only accelerate as features such as highway pilot, which could handle all highway driving between destinations, come into production. But these technologies may not drive in a manner a human passenger would like. And a big debate among autonomous car technologists is how to hand control back to a human driver who has not been engaged with the driving task for hundreds of miles.

Mahmoud believes that giving cars knowledge of their passengers' mental states could alleviate these problems. For example, if a self-driving car could know when you are fearful, it could change its driving style to be less aggressive. If a car could tell when its passengers were confused or frightened, it could display its route and final destination to provide reassurance that it knows where it is going.

Affectiva's system uses a camera focused on the car occupants' faces, identifying 33 facial landmarks, Mahmoud pointed out during a presentation at Nvidia's GPU Technology Conference in San Jose today. That information is interpreted by a neural network trained on more than 5 million facial expressions. The computer identifies seven emotions, such as joy, surprise and fear, along with other useful attributes such as engagement.
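The landmarks-to-emotions step described above can be pictured as a classifier that maps facial-landmark coordinates to a probability for each emotion. The sketch below is purely illustrative and is not Affectiva's actual model: it flattens 33 (x, y) landmarks into a 66-value feature vector and runs it through a single softmax layer with made-up random weights standing in for a trained network.

```python
import math
import random

# Hypothetical label set; the article says the system identifies seven emotions,
# naming joy, surprise and fear -- the rest here are illustrative placeholders.
EMOTIONS = ["joy", "surprise", "fear", "anger", "sadness", "disgust", "contempt"]
NUM_LANDMARKS = 33  # per the article; each landmark is an (x, y) point

def softmax(scores):
    """Convert raw scores into probabilities that sum to 1."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(landmarks, weights, biases):
    """Map 33 (x, y) landmarks to a probability for each of seven emotions."""
    features = [coord for point in landmarks for coord in point]  # 66 values
    scores = [
        sum(w * f for w, f in zip(weights[k], features)) + biases[k]
        for k in range(len(EMOTIONS))
    ]
    return dict(zip(EMOTIONS, softmax(scores)))

# Demo with random weights -- an untrained stand-in for the learned model.
random.seed(0)
weights = [[random.uniform(-0.1, 0.1) for _ in range(2 * NUM_LANDMARKS)]
           for _ in EMOTIONS]
biases = [0.0] * len(EMOTIONS)
landmarks = [(random.random(), random.random()) for _ in range(NUM_LANDMARKS)]
probs = classify(landmarks, weights, biases)
```

A real system would of course use a deep network trained on labeled face data, plus a face detector to locate the landmarks in the camera frame; only the input/output shape of this sketch mirrors what the article describes.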

The company has fed its network facial expressions from 75 countries. Mahmoud says that smiles have the same basic shape across cultures, but show different levels of exaggeration. People raised in Thailand, for example, tend to have a more subtle smile than people from other countries.

And while it may seem creepy to have your car watching your face and responding to your emotions, Affectiva's system does not send the video it records to a central server -- all emotion recognition processing runs locally on each individual installation. It only needs to connect to a server to download updated recognition models that improve its accuracy.

Mahmoud said that Affectiva is currently working with several German and Japanese automakers, and although he would not reveal much about the company's clients, a prototype using the technology should be out next year.

Automakers have been interested in driver monitoring technology, but most efforts have focused on determining whether a driver is asleep at the wheel or looking away from the road. Affectiva's software offers more detailed information about people, which could make future self-driving cars seem less like soulless robots and more responsive to our needs.

Affectiva offers a working demonstration of its technology through its Android and iOS apps.

