Software can tell if you're mean and ugly

Researchers create a tool they say can predict character traits such as dominance and meanness with an accuracy above 90 percent. It can also tell you where you land on the scale of perceived attractiveness.

The researchers used these images of public figures to test the prediction system. Images are ranked from left to right in increasing order of predicted dominance (top row), threat (middle row), and attractiveness (bottom row). To be fair, Angelina is not wearing lipstick. PLoS One

If you're having a bad hair/skin/teeth/nose day, the last thing you probably need is software to tell you you're unattractive.

Yet that's precisely what a computational tool detailed today in the journal PLoS One promises to do. Using machine-learning techniques, it also examines images of faces for other social traits, such as competence, trustworthiness, meanness, dominance, and extroversion.

Needless to say, the software can't scientifically gauge your hotness or how likely you are to pay back a loan. It can only measure how your particular eye shape and grimace might be perceived and interpreted, a reaction that can vary from culture to culture depending on a host of factors.

Facial recognition, of course, is being used for everything from photo tagging to law enforcement and computer logins these days. This software takes the practice a step further in a high-tech continuation of research aimed at connecting facial shape and features to personality and character.

For example, "the perception of dominance has been shown to be an important part of social roles at different stages of life, and to play a role in mate selection," said Mario Rojas, a researcher from the Autonomous University of Barcelona who worked on the project with a team from Princeton University. If the information on which such evaluations are made could be automatically learned, he said, it could be modeled and used as a tool for designing better interactive computer systems.

The team trained and tested their algorithm on a set of synthetic facial images from a previous study. Subjects were asked to describe and rate the images, and those results were used to generate digital visages, each associated with a specific trait.

The researchers then used a subset of those images, together with their assigned labels, to "teach" the computer how to read faces, and tested the prediction accuracy using the rest of the images. They found three traits--dominant, threatening, and mean--to be predictable with accuracies between 91 percent and 96 percent.
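The train-and-test procedure described above can be sketched in miniature. Everything here is illustrative: the three-number "geometric descriptor," the synthetic ratings, and the nearest-centroid classifier are all invented for the example (the paper uses its own descriptors and learning method), but the workflow — train on a labeled subset, then measure prediction accuracy on held-out faces — is the same.

```python
import random

random.seed(0)

def make_face(dominant):
    # Hypothetical 3-number geometric descriptor: brow height,
    # jaw width, eye spacing. "Dominant" faces are shifted upward.
    shift = 0.2 if dominant else -0.2
    return [0.5 + shift + random.gauss(0, 0.05) for _ in range(3)]

# Labeled synthetic faces, standing in for the rated images.
faces = [(make_face(d), d) for d in [True, False] * 100]
random.shuffle(faces)
train, test = faces[:150], faces[150:]

def centroid(samples):
    # Average descriptor across a set of faces.
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(3)]

# "Teach" the computer: one centroid per label from the training set.
pos = centroid([f for f, d in train if d])
neg = centroid([f for f, d in train if not d])

def dist2(a, b):
    # Squared Euclidean distance between two descriptors.
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Test prediction accuracy on the remaining, held-out faces.
correct = sum((dist2(f, pos) < dist2(f, neg)) == d for f, d in test)
accuracy = correct / len(test)
print(f"held-out accuracy: {accuracy:.2f}")
```

On this cleanly separated toy data the accuracy lands above 90 percent, much as the study reports for its best-predicted traits; real faces, of course, are far noisier.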

The study goes into great detail (PDF) about precisely which facial features correlate with which traits. Somewhat unsurprisingly, the researchers found that the area around the eyes, including alignment, size, and distance, carries more information for judgments of attractiveness, while the area around the mouth, specifically the size of the lips, is more strongly linked to extroversion.

They then challenged their program's predictive ability on the faces of celebrities, finding their results to be highly consistent with the public perception of these people (if you agree that Cameron Diaz looks less dominant than Data from "Star Trek" but beats Angelina Jolie and Albert Einstein on the hotness scale, that is).

Left: attractiveness. Right: extroversion. The size and color of the circles are proportional to the number of times a given point is used in a specific feature of the geometric descriptor, with small dark blue circles representing a low correlation. PLoS One
