The iPhone X is the latest phone with a dedicated portrait mode that emulates bokeh, that blurred background effect mostly associated with dSLR cameras. Apple introduced it with the iPhone 7 Plus, Google did it on the Pixel 2 and Samsung offers a similar Live Focus feature on its own phones.
The iPhone X, or any phone for that matter, can't replace a dSLR. But we wanted to find out if the camera on Apple's newest phone can even come close to the real thing.
How phones create that "portrait mode"
On a dSLR, the photographer can manipulate focus, aperture and the distance between the lens and the subject to control depth of field. A shallow depth of field keeps the subject sharp while the background falls out of focus.
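To get a feel for how shallow that focus zone really is, here is a quick estimate using the standard textbook depth-of-field approximation. The focal length, aperture and circle-of-confusion values below are illustrative assumptions, not measurements from this shoot:

```python
def depth_of_field(f_mm, aperture, subject_mm, coc_mm=0.03):
    """Textbook thin-lens depth-of-field estimate, in millimetres.

    coc_mm is the circle of confusion; 0.03 mm is a common
    assumption for a full-frame sensor.
    """
    h = f_mm ** 2 / (aperture * coc_mm) + f_mm  # hyperfocal distance
    near = subject_mm * (h - f_mm) / (h + subject_mm - 2 * f_mm)
    far = subject_mm * (h - f_mm) / (h - subject_mm)
    return far - near

# A 50mm lens at f/1.8 focused on a subject 2 m away:
# only about 17 cm of the scene is in sharp focus.
shallow = depth_of_field(50, 1.8, 2000)

# Stop down to f/8 and the in-focus zone grows to roughly 78 cm.
deep = depth_of_field(50, 8, 2000)
```

That narrow in-focus band at wide apertures is exactly what makes the background melt away in a dSLR portrait.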
The iPhone uses a combination of software and hardware to achieve a similar effect. It makes a depth map with its dual cameras to separate the subject from the rest of the scene, then blurs out what it perceives as the background. This is why it doesn't always get it right and often blurs out parts of the foreground or keeps parts of the background in focus.
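The segmentation-then-blur idea can be sketched in a few lines. This is a hypothetical toy version using a synthetic image and a made-up binary depth map, not Apple's actual pipeline:

```python
import numpy as np

# Stand-ins for a grayscale photo and its depth map (illustrative only).
rng = np.random.default_rng(0)
image = rng.random((8, 8))
depth = np.zeros((8, 8))
depth[2:6, 2:6] = 1.0  # 1.0 = near (the subject), 0.0 = far (background)

# Cheap 3x3 box blur of the whole frame, done with pure NumPy shifts.
blurred = sum(
    np.roll(np.roll(image, dy, axis=0), dx, axis=1)
    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
) / 9

# Keep the original pixels where the depth map says "subject";
# use the blurred frame everywhere else.
mask = depth > 0.5
portrait = np.where(mask, image, blurred)
```

A hard threshold like `depth > 0.5` is also a decent illustration of the failure mode the article describes: any pixel the depth map misclassifies near an edge gets blurred (or left sharp) wholesale, which is why hair and uneven textures give the phone trouble.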
On a standalone camera, shooting with a wide aperture will generally create bokeh -- the out-of-focus areas in a photo beyond the depth of field. The size of the sensor and the type of lens are other important components that play into how bokeh is rendered. You might have seen photos of bokeh lights where the circles of lights in the background look round, bright and smooth. Cameras with smaller sensors (like point-and-shoots or phones) can have trouble rendering these highlights.
To the test
We took some portraits using the iPhone X and a Canon 5D Mark III dSLR with a 50mm f/1.8 lens, stopping down to an aperture similar to that on the iPhone's telephoto lens.
Take a look for yourself at the results. The photos displayed are compressed on our site, so they are not an accurate reproduction of the actual files the cameras can produce.
Which is which?
To us, the differences between the two were obvious -- we took the shots after all. But for others, the difference wasn't as apparent. We showed the results to some of our CNET colleagues and didn't tell them which one was shot on which device.
Most had a hard time identifying which one was which in the examples above. The answer? The image on the left is from the dSLR and the image on the right is from the iPhone.
Here's an important caveat: the photos from the iPhone were taken with the default camera app in portrait mode. The phone processes the images, including sharpening and color corrections, before rendering the final JPEG. On the dSLR, we shot in raw on a neutral profile. Shooting in raw means you have the flexibility to process the image however you like, and if you prefer punchier colors you can easily adjust them. (You can also shoot raw on the iPhone with a third-party app.)
It wasn't until we increased the magnification of each shot that our colleagues spotted the identifying features. When you look closely, you'll notice that the blur around the edges of the iPhone X's subjects looks harsh and unnatural, especially around hair and anything with uneven texture.
Viewing photos at 100 percent magnification is not just something photographers do to "pixel peep." If you crop a photo or change parameters like exposure, these issues become much easier to see.
Where the iPhone can't keep up
The iPhone X produces pleasing portraits in optimal conditions, but where it can't keep up with the dSLR is in low light. In extremely low light the iPhone can't even activate portrait mode as you'll see below, whereas the dSLR is able to take shots regardless.
How close is the iPhone X?
In ideal conditions, the phone mimics the bokeh effect well, and portrait mode has come a long way since the iPhone 7 Plus. If you're posting on social media or viewing on a phone screen, the shot may look pretty close to what you would get on a dSLR, especially if you're viewing photos at a reduced magnification. But if you look closely, you can see where the processing still needs to improve.