Einstein bot: E = mc smile

Researchers from UC San Diego rely on developmental psychology and feedback from real-time facial expression recognition to teach the Einstein robot to make a series of complex facial movements.

Albert Einstein: A robot of many moods. Flickr/Erik Jepsen, UC San Diego

Albert Einstein has come back to life in the form of a robot with a bushy mustache and a highly expressive face. Especially noteworthy is that rather than requiring manual programming, robo-Einstein has taught itself to smile, frown, and grimace.

Researchers from the University of California, San Diego, relied on developmental psychology and feedback from real-time facial expression recognition to teach the bot to form a series of complex expressions. In an era when robot faces are becoming increasingly realistic (and sometimes downright eerie), the scientists believe their work (PDF) could help circumvent the costly need for humans to manually recalibrate robots. It could also offer insight into how infants learn to make facial expressions.

Psychologists speculate that babies learn to control their bodies through systematic exploratory movements, much as they babble in order to learn to speak. The scientists at UCSD's Machine Perception Laboratory applied the same idea to teaching their Einstein robot to form realistic facial expressions.

The Einstein robot head, which was created by Hanson Robotics, is covered in a material called Frubber and has about 30 facial muscles, each moved by a tiny servo motor connected to the muscle by a string.

The researchers directed the robot head to twist and turn its face in all directions while it analyzed its own expressions using a video camera and CERT (Computer Expression Recognition Toolbox), software developed at UC San Diego. This process, which ran on an Intel-based Mac Mini, essentially taught the bot the correlation between its muscle movements and the facial expressions they produce. It could then build on that know-how to form expressions on its own.
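The basic feedback loop can be sketched in a few lines of Python. The snippet below is an illustrative stand-in, not the UCSD code: send_servo_commands and measure_expression are hypothetical placeholders (here simulated with a random linear face response so the loop actually runs), and an ordinary least-squares fit stands in for whatever learning model the researchers used.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SERVOS = 30           # roughly the number of facial servos on the Hanson head
N_EXPRESSION_DIMS = 12  # e.g. action-unit intensities from a CERT-like recognizer (illustrative)

# Hypothetical stand-ins for the real hardware and vision pipeline: the "face"
# is simulated by a fixed random linear response so this sketch runs end to end.
_TRUE_RESPONSE = rng.normal(size=(N_SERVOS, N_EXPRESSION_DIMS))

def send_servo_commands(cmd):
    """Pretend to drive each facial servo to a position in [0, 1]."""
    return cmd

def measure_expression(cmd):
    """Pretend camera + expression recognizer: expression features for a servo pose."""
    return cmd @ _TRUE_RESPONSE + 0.01 * rng.normal(size=N_EXPRESSION_DIMS)

# 1. "Body babbling": try random muscle configurations, record the faces they produce.
commands = rng.random((2000, N_SERVOS))
expressions = np.array([measure_expression(send_servo_commands(c)) for c in commands])

# 2. Learn the inverse map (expression -> servo commands) by least squares.
X = np.c_[expressions, np.ones(len(expressions))]   # add a bias column
W, *_ = np.linalg.lstsq(X, commands, rcond=None)

def pose_for(target_expression):
    """Predict servo commands that should reproduce a target expression."""
    return np.clip(np.r_[target_expression, 1.0] @ W, 0.0, 1.0)

# 3. Ask for the servo pose behind one of the babbled expressions.
print(pose_for(expressions[0])[:5])
```

The point of the exercise is the same as in the article: once the mapping from muscle commands to observed expressions has been learned from random exploration, it can be inverted to pose the face for an expression the robot was never explicitly programmed to make.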

Was this what it looked like inside the real Einstein's head? Given his intellect, it just might have been. Flickr/Erik Jepsen, UC San Diego

For example, the robot learned eyebrow narrowing, which requires the inner eyebrows to move together and the upper eyelids to close a bit to narrow the eye aperture.

"As far as we know, no other research group has used machine learning to teach a robot to make realistic facial expressions," said Tingfan Wu, a computer science Ph.D. student from the UC San Diego Jacobs School of Engineering and one of the researchers on the project.

The researchers note that some of the bot's learned facial expressions still appear awkward (see the video after the jump, where it looks like Einstein is suffering from allergies). They say their model may currently be too simple to describe the coupled interactions between facial muscles and skin that can produce thousands of expressions in the typical non-Frubber face.

The team says it's currently working on a more accurate facial expression generation model, and, presumably, on a way to teach robo-Einstein to smile and come up with cosmological theories simultaneously.
