How Microsoft showed me what happiness really is

Technically Incorrect: Microsoft's Project Oxford claims to identify the real emotions behind pictures of faces. So I experimented with it and discovered I'm not who I thought I was.

Chris Matyszczyk

Technically Incorrect offers a slightly twisted take on the tech that's taken over our lives.


This is what 60 percent happy looks like.

Business Presence/Microsoft; screenshot by Chris Matyszczyk/CNET

I get the occasional email from readers. It's the same email, more or less.

It reads: "I wish you'd stop using that profile picture from when you were in prison." It's just an orange shirt. Truly.

I finally did something about it. I went to the astounding professionals at Business Presence.

They ask you to give them three words that you want your profile picture to express (my first effort: "Who the hell?"). Then they scout for custom backgrounds, help you choose clothes and take hundreds of pictures from which you select the one that works for you.

Then along came Microsoft with its Project Oxford. This fascinating tool uses machine learning and emotion-recognizing technology to determine what your face is really projecting in images.

Microsoft demonstrated this tool in London at the company's Future Decoded event on Wednesday. And what a deep tool it is.

I know this is the selfie-centered era, but I don't go around taking too many of those. I'm distantly related to Shrek. What, then, would Microsoft make of my new professional profile pictures?

The first revelation is that, despite my thinking that I project some form of subjective warmth (because that's what you're supposed to do in pictures, isn't it?), Microsoft says that in many of my favored pictures I project little but neutrality.

Neutrality? I ask you. Actually, we could ask the Technically Incorrect commenters.

The more pictures I entered into the very simple tool, the more it became clear that in pictures I'm devoid of anger, contempt, disgust, fear or sadness (fooling oneself is an art). My posing apparently operates only on the axes of happiness and neutrality. That's what too many years working in advertising can do to you.

Microsoft's machines concluded that the one picture that I've ended up using the most emits almost 39 percent neutrality and 60 percent happiness. Which is around 64 percent more neutrality and 72 percent more happiness than I normally emit.

As far as I remember, this was a picture the photographer caught when I was off-guard commenting on something faintly silly that the makeup expert had said.

Can it be that even when we think we're projecting, say, wryness, other people see something entirely different? Can it be that when we think we're fooling people, we're really not that good? Will we ultimately come to rely on machines to help us project the precise things we want to express?

You may want to try this absorbing tool on your own pictures. I'd also recommend you upload some pictures of, say, your lover. Naturally, I did this and naturally she has far more emotional breadth than I can evidently muster.

This experiment was a vast step forward from Microsoft's attempt earlier this year to see things in faces. The idea was to use similar software to tell someone's age from their face. In my case, the machine claimed I had no face at all.

Now I have a face that projects just two emotions, one of which isn't really much of an emotion at all.

By this time next year, I hope to have found some nuance. Happy nuance, that is.


And this is what 99.9 percent happy looks like.

Business Presence/Microsoft; screenshot by Chris Matyszczyk/CNET