Can you read your robot's emotional state?
Researchers at Georgia Tech have found age-related differences in the way adults perceive the emotional state of a virtual robot interaction partner. Huh?
If you can't determine the emotional states expressed by this virtual robot, chances are you're an older adult, according to a Georgia Tech study.
You might also have trouble serving our future robot overlords. But I digress.
In a rather strange study, researchers in the school's Human Factors and Aging Laboratory tested people's ability to gauge the emotional state of a robot by presenting them with a virtual feline displaying seven emotional states at various levels of intensity: happiness, sadness, anger, fear, surprise, disgust, and neutrality.
The study compared two groups: adults between the ages of 65 and 75, and younger adults between the ages of 18 and 27. The researchers found that the older cohort had more difficulty recognizing anger, fear, and happiness in the robot cat, often confusing happiness with its neutral state.
But the discrepancy between age groups might simply stem from problems in programming the robot to express emotion accurately. We may indeed be able to read robot "emotions," but only if they simulate our own feelings well enough.
While you might wonder why anyone would be studying robot emotions in the first place--especially when the robocat makes such simplistic expressions--the researchers believe that if robots are going to become commonplace in our society, we must be able to read their faces well to get along with them.