How Google's Pixel phone builds a better photo

The new Android phone blends multiple shots into one to generate photos that look more like what your own eyes see.

Stephen Shankland

The Google Pixel XL, a 5.5-inch Android phone, has a top-shelf camera.

Stephen Shankland/CNET

The new Google Pixel phone has a super camera, with snappy performance and image quality that beats Apple's iPhone 7 Plus overall. So how did Google do it?

A lot of the success comes from a technology Google calls HDR+ that blends multiple photos into one to overcome common problems with mobile phone photography. Knowing that photography is a top-three item for people buying a phone, Google invested heavily in the Pixel camera. HDR+ is a key part of that effort.

You may prefer Samsung Galaxy phones or Apple iPhones, but if you're a shutterbug looking for a new phone, HDR+ is a good reason to put Google's Pixel on your short list.

HDR+ is an example of computational photography, a fast-moving field in which the creation of a photo doesn't stop when an image sensor turns light from a scene into digital data. Instead, computer chips add extra steps of processing. That's useful for reducing noise, correcting lenses' optical shortcomings and stitching a camera sweep into a single panoramic shot.

But HDR+ improves what's arguably a more noticeable part of image quality called dynamic range, the ability to photograph both dim shadows and bright highlights. Expensive cameras with large sensors are better at handling both, ensuring that pine needles don't disappear into a swath of black and details on a wedding dress don't blow out into a blaze of white. But because of the way small image sensors work, phone cameras struggle with dynamic range.
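To see why highlights clip, consider a minimal Python sketch; the scene values, gain and 8-bit ceiling here are illustrative, not Pixel specifications. Once a pixel saturates, every brighter part of the scene records the same maximum value, and the distinction between them is lost:

```python
# A small sensor has a hard ceiling: once a pixel's well fills up, extra
# light is simply discarded. The numbers below are made up to show the effect.
import numpy as np

scene = np.array([0.02, 0.10, 0.50, 1.5, 3.0])  # relative scene luminance
exposure = 170.0                                 # gain for a "normal" exposure

raw = np.clip(scene * exposure, 0, 255)          # clamp to the 8-bit ceiling
print(raw)  # [  3.4  17.   85.  255.  255.] -- the two brightest patches merge
```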


Google's HDR technology, used on the right image here, blends several underexposed frames into one final photo to boost dim areas while keeping unpleasant glare out of bright patches.

Stephen Shankland/CNET

Enter high dynamic range photography, or HDR. Plenty of cameras these days employ HDR techniques -- Apple has since the iPhone 4 back in 2010 -- but Google's HDR+ does so particularly well. The result is something that looks more like what your own eyes see.

HDR+ starts with the Pixel's ability to "circulate" a constant stream of photos through the phone's memory whenever the camera app is open, 30 per second when it's bright and 15 per second when dim. When you tap the shutter button, it grabs raw image data from the last 5 to 10 frames and gets to work, according to Tim Knight, leader of Google's Android camera team.
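Google hasn't published the Pixel's buffering code, but the behavior Knight describes maps naturally onto a ring buffer. Here's a minimal Python sketch, with hypothetical callback names and frame objects, of a fixed-size buffer that always holds the most recent frames and hands the last several to the merge step when the shutter is tapped:

```python
# A minimal ring-buffer sketch. BUFFER_FRAMES, on_new_frame and
# on_shutter_tap are illustrative names, not Google's actual code.
from collections import deque

BUFFER_FRAMES = 10            # keep roughly the last 10 frames

ring = deque(maxlen=BUFFER_FRAMES)

def on_new_frame(frame):
    """Called ~30 times a second in bright light, ~15 in dim light."""
    ring.append(frame)        # the oldest frame falls off automatically

def on_shutter_tap():
    """Hand the buffered raw frames to the HDR+ merge step."""
    return list(ring)         # the 5 to 10 most recent frames
```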

The key part of HDR+ is making sure highlights don't blow out into a featureless whitewash, a common problem with clouds in the sky and cheeks in sunlight.

"HDR+ wants to maintain highlights," Knight said "We're capturing all the data underexposed -- sometimes 3 to 4 stops underexposed," meaning that each frame is actually up to 16 times darker than it ought to look in a final photo. By stacking up these shots into a single photo, though, HDR+ can brighten dark areas without destroying the photo with noise speckles. And it can protect those highlights from washing out.


HDR+ predates the Pixel, but special-purpose hardware, Qualcomm's Hexagon chip technology, lets Google accelerate it on the Pixel. "Our goal was to maintain quality but improve speed," Knight said. "We met that goal."

Specifically, Google uses an open-source image-processing software project called Halide. It took Google two years to adapt Halide so it would run on the Hexagon technology.

HDR in general works better if you have good raw material to work with. Google chose a high-end 12-megapixel Sony IMX378 sensor with large pixels that are better able to distinguish bright from dark and to avoid image noise in the first place.


Google's HDR technology does a good job making sense of high-contrast scenes, but it can leave halos like the one that appears to make this bus glow a little against the blue sky.

Stephen Shankland/CNET

Another general HDR problem is ghosting, artifacts stemming from differences in the frames caused by moving subjects like running children or trembling tree leaves. Blurring from camera shake also can be a problem. Using artificial intelligence techniques, Google's HDR+ quickly analyzes the burst of photos to pick a "lucky shot" that serves as the basis for the final photo.
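Google doesn't detail how HDR+ scores frames, but a common stand-in for this kind of lucky-shot selection is to rank frames by the variance of their Laplacian: blurry frames have weaker edges and so score lower. A minimal Python sketch of that heuristic (the function names are mine, not Google's):

```python
# Rank frames by edge strength and keep the sharpest one. Edges wrap
# around at the borders here for simplicity; a real pipeline would crop
# or pad instead.
import numpy as np

def laplacian_variance(gray):
    """gray: 2-D float array of pixel brightness. Higher = sharper."""
    lap = (-4 * gray
           + np.roll(gray, 1, axis=0) + np.roll(gray, -1, axis=0)
           + np.roll(gray, 1, axis=1) + np.roll(gray, -1, axis=1))
    return lap.var()

def pick_lucky_shot(frames):
    """Return the sharpest frame from a burst (a list of 2-D arrays)."""
    return max(frames, key=laplacian_variance)
```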

HDR and HDR+ in particular make camera processors work harder and therefore consume more battery power. And even with special-purpose processing chips, it can be hard to keep up. That's why Google doesn't use HDR+ when shooting video on the Pixel -- only some more modest adjustments to image tone.

It's not perfect. In my testing, HDR+ sometimes leaves photos looking underexposed, mutes some naturally bright colors, and creates halos in high-contrast areas that can, for example, make a tree look like it's glowing against a darker blue sky.

But in general HDR+ on the Pixel does well with overcast skies, backlit faces, harsh sunlight and other challenges. And because it's software, Google can update its camera app to improve HDR+, something it's done with earlier Nexus phones.

Blending multiple shots into one, done right, is a good recipe for success.

First published October 20, 5 a.m. PT.

Update, October 21 at 11:17 a.m.: Adds detail on why HDR+ isn't offered with video.