If you're taking a selfie on the new iPhone XS, you might think it looks a little bit different from any other selfie you've taken before, especially when you compare it to last year's iPhone X.
Some people are calling this a smoothing effect, or even a beauty mode.
And there is actually a difference between what the photos look like on the XS compared to the X.
I've been taking a lot of different images on both the front and back cameras, and I'll tell you why they look different.
The first part has to do with how the XS takes photos through HDR and computational photography.
Now, HDR is high dynamic range. You've probably heard of this before: it's essentially taking multiple different exposures, either underexposed or overexposed on the meter, and then blending them together in order to form an image that has an increased dynamic range.
So this means that you'll get more detail in the shadow areas, and also more detail retained in the highlights.
So like from bright lights, or light coming in from a window.
So that's essentially what HDR is.
And because of that blending technique, some photographers have been able to do kinda cool things with this in order to make HDR look a lot different from a regular photo.
So they can make images look a little bit more hyper-real, or they can make them look a little bit more airbrushed, because of the way that they map those tones and distribute the shadow and highlight detail.
So that's one reason why images look a little bit different.
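That blending idea can be illustrated with a tiny sketch. This is my own toy example, not Apple's pipeline: blend two grayscale exposures pixel by pixel, weighting each frame by how well exposed its pixel is.

```python
# Toy exposure blending (not Apple's algorithm): pixels are 0-255 grayscale.
# Each frame gets more weight where its pixel is well exposed (near mid-gray)
# and less where it's crushed to black or blown to white.

def weight(v, mid=127.5):
    # 1.0 at mid-gray, falling to 0.0 at pure black or pure white
    return 1.0 - abs(v - mid) / mid

def fuse(frames):
    # Blend several exposures of the same scene, pixel by pixel
    fused = []
    for pixels in zip(*frames):
        weights = [weight(v) + 1e-6 for v in pixels]  # epsilon avoids /0
        fused.append(sum(w * v for w, v in zip(weights, pixels)) / sum(weights))
    return fused

under = [10, 40, 120, 200]   # underexposed frame: shadow detail crushed
over  = [60, 120, 230, 255]  # overexposed frame: window highlight clipped
print(fuse([under, over]))
```

The last pixel shows the point: the overexposed frame is clipped at 255, so the blend leans almost entirely on the 200 from the darker frame, which is how HDR keeps detail in bright windows.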
The XS is doing a lot of this computational photography behind the scenes and blending exposures.
And it's important to know that Apple isn't the only company doing this on phones.
Google, Samsung, and many other manufacturers have HDR and blended-exposure modes in there, as well.
Now, Apple also has the addition of Smart HDR on the iPhone XS, and this is Apple's Phil Schiller at the iPhone XS launch event explaining how it works.
What the A12 Bionic is actually doing is shooting a four-frame buffer so it can capture that critical moment.
But the A12 Bionic is doing even more than that.
It's also capturing secondary inter-frames at the same time.
And those inter-frames are shot at a different exposure level to bring up highlight details.
And it's doing more than that, it's shooting a long exposure so we can get better shadow detail as well.
And when you're taking that picture, it's analyzing all of those, figuring out how to match up the best parts of each and merge them into one perfect photo.
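In rough terms, the merge Schiller describes might look something like this toy reconstruction of mine (hypothetical thresholds, and nothing like Apple's actual code): start from the main frame, pull blown highlights from an underexposed interframe, and pull crushed shadows from the long exposure.

```python
# Toy per-pixel merge in the spirit of the Smart HDR description above
# (hypothetical thresholds; the real pipeline is far more sophisticated).

HI = 225  # above this, treat the base pixel as a blown highlight
LO = 30   # below this, treat the base pixel as a crushed shadow

def smart_merge(base, interframe, long_exposure):
    out = []
    for b, inter, long_ in zip(base, interframe, long_exposure):
        if b >= HI:
            out.append(inter)   # recover highlight detail from darker interframe
        elif b <= LO:
            out.append(long_)   # recover shadow detail from the long exposure
        else:
            out.append(b)       # well-exposed pixel: keep the base frame
    return out

base  = [5, 128, 255]    # shadow crushed, midtone fine, highlight blown
inter = [0, 120, 210]    # underexposed interframe keeps the highlight
long_ = [40, 180, 255]   # long exposure keeps the shadow
print(smart_merge(base, inter, long_))  # [40, 128, 210]
```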
So here's a selfie I took to kind of show the difference between Smart HDR on the iPhone XS, and not having Smart HDR on the iPhone X.
As you can see, the window in the background on the X is pretty much completely blown out.
There's not as much detail captured there.
The XS, on the other hand, has a much more even spread of tones, with shadow and highlight detail retained, but it does look a little bit smoother. And because it looks like there's actually less contrast, your eye thinks that maybe there's less detail there, and that fools us. So that's the first reason that photos look a little bit different.
The next part is to do with noise reduction.
So, when you're taking an HDR image or blending multiple exposures together, you pretty much need to make sure that all these photos have been taken at the same time.
Otherwise, you're gonna introduce things like camera shake or subject shake into your images.
Now, to do that, a phone camera, or any other camera, pretty much needs to rely on a really fast shutter speed.
And to do this, whether in bright light or low light, you usually have to ramp up your ISO, unless you're shooting in outdoor sunlight with a lot of ambient light there.
Now, by shooting with a really high ISO or higher than you normally would use, you usually introduce quite a lot of noise into your images.
And then you have to apply some sort of noise reduction, otherwise the photos are gonna look really grainy and they're going to have speckles all over the image.
No one wants that all over their photos.
So here's an example that I shot on my DSLR at ISO 3200.
It's a low-light situation, but you can see, without any editing at all from the raw image, there is a lot of noise in this image.
Here's what happens if I apply some noise reduction to this in Lightroom. This isn't even an extreme example, but it shows you just how much detail can be smoothed out in order to remove that noise.
So that's kinda one of the byproducts of a noise reduction algorithm.
So when you're shooting photos on the XS, especially selfies in low light, obviously it's gonna have to apply a lot of noise reduction in order to get rid of that noise.
Otherwise you have a really grainy image.
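That trade-off is easy to demonstrate with a toy example of my own (nothing like the iPhone's actual denoiser): a simple three-tap moving average tames the grain on a flat area, but it also softens a real edge, which is exactly the kind of fine detail that gets smoothed out of a low-light selfie.

```python
# A crude 3-tap moving-average denoise on a 1-D row of pixel values.
# It reduces random speckle, but it blurs genuine detail at the same time.

def denoise(row):
    out = row[:]  # endpoints are left untouched for simplicity
    for i in range(1, len(row) - 1):
        out[i] = (row[i - 1] + row[i] + row[i + 1]) / 3
    return out

flat_wall  = [100, 112, 96, 104, 100]   # grain on a flat surface
crisp_edge = [0, 0, 0, 255, 255, 255]   # a sharp real edge

print(denoise(flat_wall))   # grain flattened: values pull toward ~100
print(denoise(crisp_edge))  # edge softened: in-between values appear
```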
So a lot of this does come down to personal preference.
A lot of people that I showed photos to on both phones liked the images from the XS because they looked a little bit more even, and they had a good spread of shadow and highlight detail, so it didn't look so dramatic.
But a lot of people liked the look from the X, because photos appeared to have a little bit more detail, just because they had some more contrast and less of that HDR effect.
But it's totally up to you which one you like.
But the most important thing to know is that this is not just happening on faces; this is pretty much any subject, especially if you're shooting in low light.
You will notice a difference.
One way this could be tweaked is if Apple introduces a software update that lets you adjust the strength of the HDR or the noise reduction. Or, of course, you can just shoot in raw using a third-party app, so you have full control over your images.
But I'd love to know which one you prefer and what you think.
I have an article explaining this in more detail on cnet.com.