Kia, MIT to show a car interior that adapts to your mood at CES
The concept uses "bio-signal recognition technology" to figure out how you're feeling.
When (or if) the day comes when autonomous cars can transport you from point A to point B without any need for human interaction, what will those vehicles look and feel like? If they're pod-like devices, Kia believes those vehicles should at least make sure their passengers are as comfortable as possible -- even if that means computers reading passengers' emotions.
Kia announced that it has teamed up with the MIT Media Lab's Affective Computing group to develop a concept that will be shown at CES in January. Called READ, for Real-time Emotion Adaptive Driving, the system will apparently tailor an autonomous pod's environment to the moods and emotions of its passengers.
There are few details so far on what the pod might do to influence a rider's mood, but Kia says the system relies on "bio-signal recognition technology" and artificial intelligence to interpret how its human occupants feel. The system can then adjust "conditions relating to the human senses within the cabin" to give "a more joyful mobility experience." It's easy to imagine the READ system using mood lighting, ambient music and potentially even scents to calm or appease passengers.
"READ will enable continuous communication between drivers and vehicles through the unspoken language of 'emotional feeling,' thereby providing an optimized human senses-oriented space for drivers in real-time," Albert Biermann, Kia's head of research and development, said in a statement.
Kia says more details will be announced at CES, though either way, pod-like autonomous vehicles like this concept are still a long way off. Kia isn't the only automaker figuring out what we'll do when we no longer have to drive, however: Audi plans to show off an "on-the-road entertainment format" for future autonomous vehicles at CES, for instance.