Auto Tech

Feeling sad, angry? Your future car will know

At Nvidia's developer conference, high-tech company Eyeris demonstrated how its Emovu Driver Monitoring System can detect your emotions from your facial expressions and how your car can respond.

Photo: Emovu DMS analyzes driver emotions using a camera and a deep learning network. (Eyeris)

Did you know that passengers invariably show a fear reaction when the brakes are applied in a car? That is just one of the things facial monitoring company Eyeris learned when developing its Emovu Driver Monitoring System (DMS). Using a combination of cameras, graphics processing and deep learning, Emovu analyzes the passengers in a car, determining from facial movement which of seven emotions those passengers are feeling.

Modar JR Alaoui, CEO of Eyeris, demonstrated the company's in-car technology during Nvidia's GTC developer conference, putting forth a few ideas about how monitoring drivers' emotions can lead to safer driving.

The company used deep learning to train its Emovu software to recognize facial expressions. It captured 1.25 million photos and videos showing people of five races and four age groupings, tagging that imagery with the emotions expressed by the subjects. After its deep learning network analyzed that imagery, the system could accurately identify emotions in images of people it had never seen before.

The Emovu DMS can be installed in a car, using a camera facing the driver, and determine if that driver is angry, sad, happy, surprised, fearful, disgusted or expressing no emotion.
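The seven-way classification described above amounts to mapping a model's per-emotion scores to a label. Here is a minimal sketch of that final step; the scores themselves are hypothetical stand-ins for what a trained network like Eyeris' would produce from a camera frame:

```python
# Hypothetical sketch: mapping a 7-way emotion model's output scores
# to the labels the article describes. The deep network itself is
# stubbed out; only the label-selection step is shown.

EMOTIONS = ["anger", "sadness", "happiness", "surprise",
            "fear", "disgust", "neutral"]

def classify_emotion(scores):
    """Return the emotion label with the highest score."""
    if len(scores) != len(EMOTIONS):
        raise ValueError("expected one score per emotion")
    best = max(range(len(scores)), key=lambda i: scores[i])
    return EMOTIONS[best]

# Example: scores a network might emit for a startled passenger.
print(classify_emotion([0.05, 0.02, 0.03, 0.70, 0.12, 0.03, 0.05]))  # → surprise
```

In a real system, those scores would be refreshed many times per second from the driver-facing camera, and the car would act on the resulting label stream.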

Driver monitoring systems exist today, but they generally only detect when a driver is distracted or tired. Eyeris' Emovu DMS offers more specific data about the driver's state of mind.

Pointing out Department of Transportation data that shows 80 percent of accidents happen due to driver inattention or rage, Alaoui said the information about a driver's emotions could be used by the car to take pre-crash actions, such as tightening seat belts or preparing braking. An autonomous car of the future could actually take over the driving if it felt its human wasn't up to the task.

Another interesting data point could be derived from correlating driver emotions with locations. If every driver going through a particular intersection exhibits surprise, the local transportation authority could investigate whether its infrastructure is causing a problem and correct it.

Eyeris isn't alone in developing this type of driver analysis. At this year's CES, Swiss company Nviso exhibited similar technology as part of the FANCI concept dashboard. Given that camera technology is becoming increasingly inexpensive, these types of driver monitoring systems seem like a sure bet in upcoming cars.