LONDON -- It's the look in his eyes, the bemused expression on her face. That's how you, the human, get a sense of how someone else is feeling.
Now Microsoft says it can get a computer to pick up on those same cues.
The company's Project Oxford artificial intelligence team has concocted emotion-reading technology that uses knowledge of facial expressions to identify specific human feelings conveyed in photos.
The tool was on display at Microsoft's Future Decoded event here Wednesday, and it has been released in beta to developers. It trains computers to recognize eight core emotional states: anger, contempt, fear, disgust, happiness, neutral, sadness and surprise.
Microsoft Chief Executive Satya Nadella spoke a day earlier at the event about how he wants to create alternative forms of input and output for computing. This tool is an example of what one of those inputs could look like. With the ability to interpret a user's mood, a device could interact with that user in a whole new way.
As of now, the tool seems less than foolproof and doesn't have the capacity to recognize the full spectrum of human emotion. The potential is there, however. Chris Bishop, head of Microsoft Research Cambridge, demoed how the tool can be used to track multiple people and multiple emotions at once, as well as how each emotion is registered on a scale of zero to one.
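To make that concrete, here is a brief sketch of how a developer might work with the kind of output described above: one entry per detected face, with each of the eight core emotions scored on a scale of zero to one. The response shape, field names and scores below are illustrative assumptions for this article, not Microsoft's actual API format.

```python
# Hypothetical response from an emotion-recognition service: a list of
# detected faces, each with a bounding box and a score from 0 to 1 for
# each of the eight core emotions. All field names and values here are
# invented for illustration.
sample_response = [
    {"faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 64},
     "scores": {"anger": 0.01, "contempt": 0.02, "disgust": 0.00,
                "fear": 0.00, "happiness": 0.90, "neutral": 0.05,
                "sadness": 0.01, "surprise": 0.01}},
    {"faceRectangle": {"left": 201, "top": 88, "width": 60, "height": 60},
     "scores": {"anger": 0.03, "contempt": 0.01, "disgust": 0.01,
                "fear": 0.30, "happiness": 0.02, "neutral": 0.10,
                "sadness": 0.03, "surprise": 0.50}},
]

def dominant_emotion(face):
    """Return the (emotion, score) pair with the highest score for one face."""
    return max(face["scores"].items(), key=lambda kv: kv[1])

# Report the strongest emotion for every face in the frame at once,
# mirroring the multi-person tracking shown in the demo.
for i, face in enumerate(sample_response):
    emotion, score = dominant_emotion(face)
    print(f"face {i}: {emotion} ({score:.2f})")
```

Because every emotion gets its own score rather than a single label, an app could also react to mixed signals, say, a face registering both surprise and fear, instead of forcing one answer per face.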
Microsoft gives several examples of how it envisions the tech being implemented in a real-world context. Marketers might use it to gauge the reactions of customers, or it could be baked into a messaging app that can interpret emotions in images.
The human emotion reader is one of several Project Oxford tools capable of interpreting words, sounds or images that are set to be released to developers in beta before the end of the year. Microsoft hopes developers who don't necessarily have expertise in machine learning or artificial intelligence will be able to use the tools to build such features into their apps.