The device, called the Emotional Social Intelligence Prosthetic, or ESP, was presented by Rana El Kaliouby on Tuesday at the 2006 Body Sensor Network Conference at the MIT Media Lab. The research team hopes the device will help people with autism learn to better read the social cues of others.
"Mind-reading" is a psychology term for the subconscious notice and analysis of nonverbal cues, such as facial expressions and head movements, which humans regularly use to determine the emotional states of others.
"Mind-reading is something we do all the time subconsciously," El Kaliouby told CNET News.com. "We use behavior and nonverbal cues to analyze the state of those you are speaking with to modify your own actions and those of others by trying to motivate them," she said.
El Kaliouby is developing the ESP device for her postdoctoral project as part of the Affective Computing research group at the MIT Media Lab under Rosalind Picard. Alea Teeters, also a member of the group and the ESP project, demonstrated the device. The project stems from El Kaliouby's doctoral work at the University of Cambridge, in which she developed the computational model on which the device is based. Like humans, the system determines emotional states by analyzing hierarchical combinations of subtle facial movements and gestures, such as eyebrow raising, lip pursing and head nodding.
The ESP consists of a tiny wearable video camera, an earphone and a small vibrating device that can be worn on a belt. The camera can be attached to a baseball hat, or worn around the neck on a stand akin to a harmonica holder.
The ESP camera can be worn facing outward by the speaker, or as a self-cam by the listener. As conversation ensues, the device "mind-reads" for the wearer. When the listener, whom the camera is focused on, begins to exhibit signs of boredom, the speaker is signaled so that she can readjust her behavior to bring the listener back into the conversation.
The device is especially useful to those with Autism Spectrum Condition (ASC). People with ASC often lack the ability to evaluate others' emotions on their own. The result is that high-functioning autistics, who might otherwise fare reasonably well in the world on their own, are hampered by a tendency toward misunderstanding and boring others.
The ESP device can prompt autistic people, who are prone to monologues or repetitive behavior, to ask questions, or give the listener a chance to participate in conversation. The hope is that with long-term use of the device as a self-teaching tool, ASC patients will eventually learn how to read for themselves the emotional responses in others.
According to data released by the Centers for Disease Control and Prevention, one in 166 children has some form of autism. These cases range from mild to severe, and the prevalence of the disorder among the population has been on the rise since the 1980s.
The computer data-sets of the ESP, which are currently programmed to detect six standard emotional states, can be readjusted for cultural differences in facial expression patterns, or updated with new information personalized to the wearer.
However, there are some challenges. The ESP computer does not yet take eye movement into account, and a better digital camera is needed so that the device can more easily fit inside a baseball cap or clip to a pair of glasses.
According to Picard, the Affective Computing group has received human subject approval and will be teaming up with another MIT research group that works with high-functioning teenagers who have milder forms of autism, such as Asperger's syndrome, to further test the device. These teens will wear the ESP when working with therapists, and eventually out in the world, to improve their social interaction, communication and repetitive patterns.
"People keep asking me if it can detect flirting," El Kaliouby joked. "I could program it to do that. Maybe it could be used for dating."