Charles in charge: Nav system knows how you feel
A Cambridge University professor is developing an emotionally intelligent navigation system that can read your emotions and respond accordingly. This could bode well for curbing road rage.
A Cambridge University professor is developing a navigation system that does what most boyfriends can't: read your emotions, sense what's going on, and adapt to the situation.
Just kidding about the boyfriend part.
Charles is a robot that is more co-pilot than GPS device. Frustrated by unintuitive gadgets that aren't helpful--let alone interactive--Professor Peter Robinson, who leads the Rainbow Group for computer graphics and interaction at Cambridge, developed an emotionally intelligent navigation system that can tell how you're feeling and respond accordingly.
The system uses sensors and algorithms trained on predefined mental states to track facial cues, tone of voice, body language, and posture. Using this information, Charles can read human emotion with 70 percent accuracy, on par with human ability, Robinson says in a YouTube video demonstrating the project.
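To make the idea concrete, here is a minimal toy sketch of matching multimodal cues against predefined mental-state templates, loosely following the article's description. The state names, cue labels, and channel weights are all invented for illustration; they are not drawn from Robinson's actual system.

```python
# Toy multimodal emotion matcher. Every name and weight here is hypothetical.
from dataclasses import dataclass

@dataclass
class Observation:
    face: str      # e.g. "frown", "smile", "neutral"
    voice: str     # e.g. "tense", "calm"
    posture: str   # e.g. "rigid", "relaxed"

# Predefined mental-state templates: the cue each channel is expected to show.
MENTAL_STATES = {
    "frustrated": {"face": "frown", "voice": "tense", "posture": "rigid"},
    "content":    {"face": "smile", "voice": "calm",  "posture": "relaxed"},
    "neutral":    {"face": "neutral", "voice": "calm", "posture": "relaxed"},
}

# Channel weights: in this toy model the face carries the most signal.
WEIGHTS = {"face": 0.5, "voice": 0.3, "posture": 0.2}

def classify(obs: Observation) -> tuple[str, float]:
    """Return the best-matching mental state and its match score (0..1)."""
    def score(template):
        # Sum the weights of the channels whose observed cue matches.
        return sum(w for ch, w in WEIGHTS.items()
                   if getattr(obs, ch) == template[ch])
    best = max(MENTAL_STATES, key=lambda s: score(MENTAL_STATES[s]))
    return best, score(MENTAL_STATES[best])

state, confidence = classify(Observation("frown", "tense", "relaxed"))
```

A real system would replace the string cues with classifier outputs from cameras and microphones, but the template-matching shape is the same.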
But reading emotion is only one aspect of the robot's capability. Charles can also respond with human-like emotion.
With cameras for eyes and 24 motors for muscles, the robot's head and mouth move as it gives directions and mimics human expressions. Unlike current GPS systems, Charles politely tells you where to go through conversation. Should you disagree with the directions Charles provides, you can suggest an alternate route. Instead of announcing that it's recalculating or insisting on the programmed route, the robot accepts your decision.
Despite the obvious advantages, Charles is a long way from replacing TomTom as your GPS device. The system could, however, end up as part of a next-generation safety feature. A navigation system that senses emotion could block incoming calls when it detects that the driver is stressed, the professor said in an interview with The Telegraph. Call blocking is already available in Ford's MyFord Touch and MyLincoln Touch systems, and it's not much of a stretch to imagine that feature paired with an emotion-sensing GPS.
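The call-blocking idea reduces to a simple policy: map an estimated stress level to an action for each incoming call. The sketch below is purely illustrative; the function name, thresholds, and action labels are invented, and neither Ford's systems nor Charles exposes such an API.

```python
# Hypothetical call-handling policy driven by an emotion-sensing system.
# Thresholds and labels are invented for illustration.

def route_incoming_call(stress_level: float) -> str:
    """Decide how to handle an incoming call given estimated driver stress (0.0-1.0)."""
    if stress_level >= 0.7:           # high stress: keep the driver undisturbed
        return "send to voicemail"
    if stress_level >= 0.4:           # moderate stress: let the driver decide later
        return "silent notification"
    return "ring through"             # calm driver: allow the call

print(route_incoming_call(0.85))
```

The hard part, of course, is not this policy but producing a trustworthy stress estimate in the first place.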
Audi is testing in-cabin cameras and sensors that track head position to assess drivers' attentiveness. If the system detects that a driver isn't paying attention, it can activate emergency braking earlier to avoid a crash. To curb road rage, a system like Charles could limit speed or take over driving altogether--should autonomous cars ever become the norm.
It may sound invasive, but the polite, human-like personality of Charles makes it seem less threatening. To see what the future of emotion-sensing navigation systems looks like, watch the professor's video of his project: