
Galaxy S23 Ultra and Rival Phones Use This Tech for Better Photos

Top-tier smartphones like the Samsung Galaxy S23 and iPhone 14 Pro rely on pixel binning for good photos when it's dark or bright. Here's how it works.

Stephen Shankland
The iPhone 14 Pro (left) and the Galaxy S22 Ultra use pixel binning on their main sensors to open up new photo quality options.

Andrew Lanxon/CNET

Megapixels used to be so much simpler: A bigger number meant your camera could capture more photo detail as long as the scene had enough light. But a technology called pixel binning that's now universal on flagship smartphones -- including in the Samsung Galaxy S23 announced Wednesday -- is changing the old photography rules for the better. 

In short, pixel binning gives you a camera that offers lots of detail when it's bright out, without becoming useless when the light is dim. It lets phone makers offer camera options with new zoom levels, too, like 2x telephoto and wider-angle 8K video.

Pixel binning arrived in 2018, spread widely in 2020 with models like Samsung's Galaxy S20 Ultra and Xiaomi's Mi 10 Pro, and arrived on Apple and Google hardware with the iPhone 14 Pro and Pixel 7 phones in 2022. Pixel binning let Samsung cram a 108-megapixel main camera sensor into 2022's Galaxy S22 Ultra and a new 200-megapixel main camera into the S23 Ultra that debuted at the Samsung Unpacked 2023 launch event.

The necessary hardware changes bring some tradeoffs and interesting details, though, and different phone makers are trying different pixel binning recipes. Here's a closer look.

Read more: Check out CNET's Google Pixel 7 Pro review, iPhone 14 Pro review and Galaxy S23 Ultra first look

What is pixel binning?

Pixel binning is a technology that's designed to make an image sensor more adaptable to different conditions by grouping pixels in different ways. When it's bright you can shoot at the full resolution of the sensor, at least on some phones. When it's dark, sets of pixels — 2x2, 3x3, or 4x4, depending on the sensor — can be grouped into larger virtual pixels that gather more light but take lower-resolution shots.

For example, Samsung's Isocell HP2 sensor can take 200-megapixel shots, 50-megapixel shots with 2x2 pixel groups, and 12.5-megapixel shots with 4x4 pixel groups.
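
Conceptually, binning is just combining blocks of neighboring sensor values. Here's a minimal NumPy sketch of the idea, assuming the raw frame is a plain 2D array of brightness values (real sensors combine charge on the chip itself, so this illustrates the math, not the silicon):

```python
import numpy as np

def bin_pixels(raw: np.ndarray, factor: int) -> np.ndarray:
    """Average each factor x factor block of pixels into one virtual pixel."""
    h, w = raw.shape
    raw = raw[:h - h % factor, :w - w % factor]  # drop rows/columns that don't fill a block
    return raw.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# A tiny 8x8 frame stands in for the sensor grid. With factor=4 it collapses
# to 2x2 -- the same 16-to-1 reduction that turns 200 megapixels into 12.5.
frame = np.arange(64, dtype=float).reshape(8, 8)
print(bin_pixels(frame, factor=4).shape)  # (2, 2)
```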

Pixel binning offers another advantage that arrived in 2020 phones: virtual zoom. Phones can crop a shot to gather light from only the central pixels of a sensor like the iPhone 14 Pro's 48-megapixel main camera or the Google Pixel 7's 50-megapixel camera. That turns a 1x main camera into a 2x zoom that takes 12-megapixel photos. It only works well in relatively good light, but it's a great option, and 12 megapixels has been the prevailing resolution for years now, so it's still a useful shot.
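
The crop-based zoom is even simpler to picture: keep only the central region of the full-resolution frame. A hedged sketch, with illustrative dimensions standing in for a roughly 48-megapixel sensor:

```python
import numpy as np

def center_crop(raw: np.ndarray, crop_h: int, crop_w: int) -> np.ndarray:
    """Return the central crop_h x crop_w region of a frame."""
    h, w = raw.shape
    top, left = (h - crop_h) // 2, (w - crop_w) // 2
    return raw[top:top + crop_h, left:left + crop_w]

# Halving each dimension keeps the central quarter of the pixels: roughly a
# 12-megapixel image covering half the field of view, in other words a 2x zoom.
frame = np.zeros((8064, 6048), dtype=np.uint16)  # ~48-megapixel stand-in
print(center_crop(frame, 4032, 3024).shape)  # (4032, 3024), ~12 megapixels
```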

With such a high base resolution, pixel binning sensors also can be more adept with high-resolution video, in particular at extremely high 8K resolution.

Pixel binning requires some fancy changes to the sensor itself and the image-processing algorithms that transform the sensor's raw data into a photo or video.

Is pixel binning a gimmick?

No. Well, mostly no. It does let phone makers brag about megapixel numbers that vastly exceed what you'll see even on professional-grade DSLR and mirrorless cameras. That's a bit silly, since the larger pixels on high-end cameras gather vastly more light and feature better optics than smartphones. But few of us haul those big cameras around, and pixel binning can wring more photo quality out of your smartphone camera.

How does pixel binning work?

To understand pixel binning better, you have to know what a digital camera's image sensor looks like. It's a silicon chip with a grid of millions of pixels (technically called photosites) that capture the light that comes through the camera lens. Each pixel registers only one color: red, green or blue.

The colors are staggered in a special checkerboard arrangement called a Bayer pattern that lets a digital camera reconstruct all three color values for each pixel, a key step in generating that JPEG you want to share on Instagram.
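
For a concrete picture, this small sketch prints the common RGGB variant of the pattern (arrangements vary by sensor, so treat this as illustrative):

```python
import numpy as np

def bayer_pattern(h: int, w: int) -> np.ndarray:
    """Return an h x w grid of color labels following the RGGB layout."""
    pattern = np.empty((h, w), dtype="<U1")
    pattern[0::2, 0::2] = "R"  # red on even rows, even columns
    pattern[0::2, 1::2] = "G"  # green fills half the grid...
    pattern[1::2, 0::2] = "G"  # ...because eyes are most sensitive to it
    pattern[1::2, 1::2] = "B"  # blue on odd rows, odd columns
    return pattern

print(bayer_pattern(4, 4))
```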

This diagram shows how the image sensor on the Samsung Galaxy S20 Ultra's 108-megapixel camera has 3x3 pixel groups to enable pixel binning. The technology lets a camera take either high-resolution photos when it's bright or lower-resolution shots in dimmer light.

Samsung

Combining data from multiple small pixels on the image sensor into one larger virtual pixel is really useful in lower-light situations, where big pixels are better at keeping image noise at bay and capturing color. When it's brighter out, there's enough light for the individual pixels to work on their own, offering the higher-resolution shot or a zoomed-in view.

Pixel binning commonly combines four real pixels into one virtual pixel "bin." But Samsung's Galaxy S Ultra line has combined 3x3 groups of real pixels into one virtual pixel, and the South Korean company is moving to 4x4 binning with the Galaxy S23 Ultra's 200-megapixel sensor.

When should you use high resolution vs. pixel binning?

Most people will be happy with lower-resolution shots, and that's the default my colleagues Jessica Dolcourt and Patrick Holland recommend after testing the new Samsung Galaxy phones. Apple's iPhones won't even take 48-megapixel shots unless you specifically enable the option while shooting with the high-end ProRaw image format, and Google's Pixel 7 Pro doesn't offer full 50-megapixel photos at all.

The 12-megapixel shots offer better low-light performance, but they also avoid the monster file sizes of full-resolution images that can gobble up storage on your device and online services like Google Photos and iCloud. For example, a sample shot my colleague Lexy Savvides took was 3.6MB at 12 megapixels with pixel binning and 24MB at 108 megapixels without.

Photo enthusiasts are more likely to want to use full resolution when it's feasible. That could help you identify distant birds or take more dramatic nature photos of distant subjects. And if you like to print large photos (yes, some people still make prints), more megapixels matter.

Does a 200-megapixel Samsung Galaxy S23 Ultra take better photos than a 61-megapixel Sony A7r V professional camera?

No. The size of each pixel on the image sensor also matters, along with other factors like lenses and image processing. There's a reason the Sony A7r V costs $3,898 while the S23 Ultra costs $1,200 and can also run thousands of apps and make phone calls.

Image sensor pixels are squares whose width is measured in millionths of a meter, or microns. A human hair is about 75 microns across. On Samsung's Isocell HP2, a virtual pixel in a 12-megapixel shot is 2.4 microns across. In 200-megapixel mode, a pixel measures just 0.6 microns. On a Sony A7r V, though, a pixel is 3.8 microns across. That means the Sony can gather about two and a half times more light per pixel than the HP2 in its 12-megapixel binning mode, and 39 times more than in its 200-megapixel full-resolution mode, a major advantage for image quality.
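
Since light gathered scales with pixel area, the square of pixel width, those ratios are easy to check with the figures quoted above; the rounded numbers here land near 40 rather than exactly 39 for the full-resolution case, since the exact value depends on unrounded pixel sizes:

```python
# Pixel widths in microns, as quoted above.
sony_a7r_v = 3.8   # Sony A7r V, 61 megapixels, full-frame sensor
hp2_binned = 2.4   # Isocell HP2 virtual pixel, 12-megapixel binned mode
hp2_native = 0.6   # Isocell HP2 native pixel, 200-megapixel mode

print((sony_a7r_v / hp2_binned) ** 2)  # ~2.5x more light per pixel
print((sony_a7r_v / hp2_native) ** 2)  # ~40x more light per pixel
```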

Phones are advancing faster than traditional cameras, though, and closing the image quality gap. Computational photography technology like combining multiple frames into one shot and other software processing tricks made possible by powerful phone chips are helping, too. That's why my colleague and professional photographer Andrew Lanxon can take low-light smartphone photos handheld that would require a tripod with his DSLR. And image sensors in smartphones are getting bigger and bigger to improve quality.

Why is pixel binning popular?

Because miniaturization has made ever-smaller pixels possible. "What has propelled binning is this new trend of submicron pixels," those less than a micron wide, said Devang Patel, a senior marketing manager at Omnivision, a top image sensor manufacturer. Packing in lots of those pixels lets phone makers, desperate to make this year's phone stand out, brag about big megapixel numbers and 8K video. Binning lets them make that boast without sacrificing low-light sensitivity.

Can you shoot raw with pixel binning?

That depends on the phone. Photo enthusiasts like the flexibility and image quality of raw photos — the unprocessed image sensor data, packaged as a DNG file. But not all phones expose the raw photo at full resolution. The iPhone 14 Pro does, but the Pixel 7 Pro does not, for example.

The situation is complicated by the fact that raw processing software like Adobe Lightroom expects raw images whose color data comes in a traditional Bayer pattern, not pixel cells grouped into 2x2 or 3x3 patches of the same color.

The Isocell HP2 has a clever trick here, though: it uses AI technology to "remosaic" the 4x4 pixel groups to construct the traditional Bayer pattern color checkerboard. That means it can shoot raw photos at full 200-megapixel resolution, though it remains to be seen whether that will be an option exposed in shipping smartphones.
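
To see what remosaicing has to accomplish, this sketch contrasts the two color layouts for the simpler 2x2-binned ("quad") case; real remosaic algorithms, including Samsung's AI approach, interpolate actual pixel values rather than just relabeling positions:

```python
import numpy as np

def quad_layout(h: int, w: int) -> np.ndarray:
    """Binned sensor layout: each color occupies a 2x2 block of pixels."""
    tile = np.array([["R", "R", "G", "G"],
                     ["R", "R", "G", "G"],
                     ["G", "G", "B", "B"],
                     ["G", "G", "B", "B"]])
    return np.tile(tile, (h // 4, w // 4))

def bayer_layout(h: int, w: int) -> np.ndarray:
    """Conventional Bayer layout: colors alternate every pixel."""
    tile = np.array([["R", "G"],
                     ["G", "B"]])
    return np.tile(tile, (h // 2, w // 2))

print(quad_layout(4, 4))   # what a 2x2-binned sensor actually records
print(bayer_layout(4, 4))  # what raw converters like Lightroom expect
```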

What are the downsides of pixel binning?

For the same size sensor, 12 real megapixels would perform a bit better than 12 binned megapixels, says Judd Heape, a senior director at Qualcomm, which makes chips for mobile phones. The sensor would likely be less expensive, too. And when you're shooting at full resolution, more image processing is required, which shortens your battery life.

Indeed, pixel binning's sensor costs and battery and processing horsepower requirements are reasons it's an option mostly on higher-end phones.

For high-resolution photos, you'd get better sharpness with a regular Bayer pattern than with a binning sensor using 2x2 or 3x3 groups of same-color pixels. But that isn't too bad a problem. "With our algorithm, we're able to recover anywhere from 90% to 95% of the actual Bayer image quality," Patel said. Comparing the two approaches in side-by-side images, you probably couldn't tell a difference outside lab test scenes with difficult situations like fine lines.

If you forget to switch your phone to binning mode and then take high-resolution shots in the dark, image quality suffers. Apple automatically uses pixel binning to take lower-resolution shots, sidestepping that risk.

Could regular cameras use pixel binning, too?

Yes, and judging by some full-frame sensor designs from Sony, the top image sensor maker right now, they may someday do just that.

What's the future of pixel binning?

Several developments are possible. Very high-resolution sensors with 4x4 pixel binning could spread to more premium phones, while less exotic 2x2 pixel binning spreads to lower-end phones.

Sensor maker Omnivision shows how 2x2 pixel binning (lower left) can be used to create larger virtual pixels (second row, top) or re-create a traditional Bayer checkerboard pattern (second row, bottom). It also can be used to create HDR images (third row) or to improve autofocus with larger microlenses (fourth row).

Omnivision

Another direction is better HDR, or high dynamic range, photography that captures a better span of bright and dark image data. Small phone sensors struggle to capture a broad dynamic range, which is why companies like Google and Apple combine multiple shots to computationally generate HDR photos.

But pixel binning means new pixel-level flexibility. In a 2x2 group, you could devote two pixels to regular exposure, one to a darker exposure to capture highlights like bright skies, and one to a brighter exposure to capture shadow details.

Indeed, Samsung's HP2 can divvy up pixel duties this way for HDR imagery.
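
Here's a hedged sketch of how such a split-exposure group might be merged, using a made-up 4x exposure ratio (the real pipeline runs on the sensor and image processor with calibrated ratios, so this only illustrates the principle):

```python
def merge_hdr_group(normal_a: float, normal_b: float,
                    short: float, long: float,
                    ratio: float = 4.0) -> float:
    """Merge four same-color sub-pixels shot at different exposures.

    `ratio` is a hypothetical exposure multiplier, not a value from any
    real sensor datasheet.
    """
    # Scale the short and long exposures back to the normal exposure,
    # then average all four estimates of the scene brightness.
    estimates = [normal_a, normal_b, short * ratio, long / ratio]
    return sum(estimates) / len(estimates)

# The short exposure preserves highlight detail; the long one digs out shadows.
print(merge_hdr_group(0.50, 0.52, 0.13, 1.90))  # ~0.50
```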

Omnivision also expects autofocus improvements. With earlier designs, each pixel is capped with its own microlens designed to gather more light. But now a single microlens sometimes spans a 2x2, 3x3, or 4x4 group, too. Each pixel under the same microlens gets a slightly different view of the scene, depending on its position, and the difference lets a digital camera calculate focus distance. That should help your camera keep the photo subjects in sharp focus.
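
To make the geometry concrete, here's a toy sketch of the phase-detection principle: two 1D views of the same scene and a brute-force search for the shift that aligns them (real autofocus systems work on 2D data with calibrated optics, so treat this purely as an illustration):

```python
import numpy as np

def best_shift(left: np.ndarray, right: np.ndarray, max_shift: int = 8) -> int:
    """Brute-force the shift that best aligns the two sub-pixel views."""
    trim = slice(max_shift, -max_shift)  # ignore edges disturbed by the roll
    errors = {s: float(np.mean((left[trim] - np.roll(right, s)[trim]) ** 2))
              for s in range(-max_shift, max_shift + 1)}
    return min(errors, key=errors.get)  # a shift of 0 means the subject is in focus

# Two views of the same scene, offset by 3 pixels as if the lens were misfocused.
left = np.sin(np.linspace(0, 6, 64))
right = np.roll(left, 3)
print(best_shift(left, right))  # -3: shift the right view back 3 pixels to align
```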