High dynamic range (HDR) is quite the buzzword. There's been an explosion of "4K HDR" TV sets to choose from, along with HDR TV shows and movies from Netflix, Amazon and others.
Meanwhile there's "another" HDR with which most people are much more familiar. It's a photography process that's been around for years. And now it's received a new infusion of publicity thanks to the "HDR+" feature on Google's new Pixel phone.
TV HDR and photo HDR have similar goals, but go about them quite differently. Here's everything you need to know about the full range, from Google to Netflix.
Display vs. capture
Both versions of HDR aim to do the same thing: make the digital version of reality more like what your eye sees in real reality. Both push the limits of what current technology can do, to make it look more realistic.
HDR for TVs is essentially a display process. It refers to a TV's ability to recognize HDR content and display it in a way "normal" TVs can't.
Don't worry, we'll dive into both of these descriptions in a moment.
Since a TV is a display device and a camera is a capture device, the difference makes sense, but the use of the same term for both is still confusing. Still, this capture vs. display difference provides a convenient way to think about HDR as an overall concept.
The TV version of HDR, though newer, may be a little easier to understand. The TV makes the bright parts of the image really bright, while keeping the dark parts dark. This range between light and dark, also known as the contrast ratio, is supposed to be greater on HDR-capable TVs than on standard TVs.
In its simplest state, it means a brighter TV, but only in the areas on screen that need it. The result is an image that really pops and looks more like what you'd see in the real world. In addition, there's potentially more data available for more detail in the bright and dark parts of the image (which we'll talk about later).
LCD is the dominant TV technology today, and the best way to achieve these bright peaks and dark blacks on an LCD TV is with local dimming -- preferably the full-array variety. One prototype TV built to demonstrate HDR technology, with its 18,000 individually addressable LEDs, is basically an LCD TV on steroids.
OLED TVs don't get as bright as LCDs, although OLED can still excel at HDR because of its capability to produce a perfect shade of black.
For what it's worth, the industry standard gives allowances to both technologies.
Ideally, HDR TVs will be fed specialized HDR content, whether from Ultra HD Blu-ray, Netflix, Amazon and so on.
There are also a couple of competing HDR standards out there: HDR10 and Dolby Vision. Most HDR TVs support HDR10; some support both.
That's the basics: an HDR TV is a regular TV with enhanced performance. Think a Ferrari equipped with two big turbos. Or if you're into the classics, the Shelby Cobra vs. the AC Ace. For more detail, check out our explainer: How HDR works.
Photo HDR has been around for a few years, and is probably more familiar, especially if you like to play around with your phone's camera settings.
A camera sensor (and the rest of the processing involved) can only capture a limited range of light at one time. Getting really bright objects, like the sun, at the same time as objects in shadow, is really difficult. The cheaper, worse or older the camera sensor, the less "range" it has (generally) to capture everything in one image.
To create an image with a greater range from light to dark, HDR in cameras captures the same scene at multiple exposures. In a typical two-shot HDR process, one exposure captures the bright information, the other captures the dark info. These are combined using processing, either in the camera or after, via software like Photoshop. Many cameras use up to six shots to produce HDR images in-camera, a process referred to as "multishot HDR."
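To make the two-shot process concrete, here's a minimal sketch in Python. It's not any real camera's algorithm; the scene values, the gains and the simple "prefer the unclipped pixel" rule are all illustrative assumptions.

```python
# Toy two-shot HDR merge. Each simulated exposure clips scene luminance
# to an 8-bit (0-255) range; the merge keeps whichever exposure recorded
# each pixel without clipping, rescaled back to common units.

def expose(scene, gain):
    """Simulate one exposure: scale luminance by gain, clip to 0-255."""
    return [min(255, round(v * gain)) for v in scene]

def merge_two_shot(bright, dark, bright_gain, dark_gain):
    """Prefer the bright shot where it isn't blown out; elsewhere fall
    back to the dark shot, which kept the highlights."""
    merged = []
    for b, d in zip(bright, dark):
        if b < 255:                        # not clipped in the bright shot
            merged.append(b / bright_gain)
        else:                              # blown out: recover from dark shot
            merged.append(d / dark_gain)
    return merged

scene = [10, 40, 120, 400, 900]            # "true" luminance, wider than 8-bit
bright = expose(scene, 1.0)                # shadows OK, sun clips at 255
dark = expose(scene, 0.25)                 # sun OK, shadows crushed
hdr = merge_two_shot(bright, dark, 1.0, 0.25)
print(hdr)                                 # recovers the full range of `scene`
```

Real implementations also align the frames, blend smoothly instead of switching per pixel, and tone-map the result back down for display, but the basic idea is the same.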
Most mobile phones and many cameras have an HDR feature built in. At the very least, it lets you capture a difficult scene that normally would be blown out or underexposed, depending on how you set the exposure. It could be used to bring out some details in the shadows in a picture with bright sunlight, for example, or bring out details in the clouds near the sun.
With many different exposures and heavier-handed editing, HDR can bring a hyperrealism to photos. I'm not normally a fan, but photographers such as Trey Ratcliff take incredible photos that use HDR to great effect.
Even more detail
Let's take a look at this photo (the same one as at the top of this article, but down here again for easy comparison). Here, below, is the darkest exposure. Notice how the sun and clouds look great.
Here's the brightest exposure. Notice how you can see the beach, but the sun is blown out.
Now here they are combined as an HDR photo:
This is the limitation of most mainstream camera sensors. You can't have extreme brightness at the same time as shadows. Using Photoshop and Lightroom, I combined these to create the one image above that has both the shadow detail and the bright details. This is more like what I saw when I was standing there.
The professional-level sensors used in motion picture cameras can capture higher dynamic range images than current TVs are capable of reproducing. The idea behind HDR TVs (and HDR content) is to allow those images to be seen in the home.
TV HDR aims to actually expand the dynamic range of what you're seeing, not just enhance it with processing. If you were to view the dark photo above on an HDR TV, for example, the sun would be very bright, with the dark parts very dark. The same image on a non-HDR TV would look flat by comparison, with less punch in the bright areas.
In addition, with 10-bit LCD panels, there will be extra gradations available. So for the above image, there could be more steps available in the bright parts. That means more detail in the areas above that look blown out now. Perhaps not quite as much detail as what's possible in the darker exposure image, but more than what we have now with our 8-bit TV system.
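The gradation math is simple arithmetic: an 8-bit system has 2^8 = 256 steps per channel, while a 10-bit system has 2^10 = 1,024. A quick sketch:

```python
# Steps available per color channel at common bit depths.
for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} gradations per channel")
```

Four times as many steps between black and peak white means banding and near-blown-out areas can hold noticeably more detail.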
Google's HDR+ and picture processing vs. 'native' HDR
The new Google Pixel has a camera technology called HDR+. It works largely the same as other HDR modes in phones. It expands the dynamic range of the photo by pre-shooting underexposed images before you take the photo.
What does that mean? Let's put this in a slightly simpler way, with some arbitrary but easier to understand numbers. Let's say you want to take a picture of this: 123456789.
In this example "1" is dark, like the shadow under a car during the day, and "9" is the sunlight reflecting off the car's chrome. In real life, your eye could see both of those things at the same time, with minimal difficulty. In other words, your eye can see 123456789 all at once.
Cameras, though, can't. They don't have the dynamic range of your eye. So they have to pick and choose. Let's say the camera can show 5 of these numbers. Does it show 12345, making everything bright in the image look like a white blob? Or does it show 56789, making everything in shadow look like a black blob? Of course the answer is it will show 34567 or 45678, so some highlights are lost, and some shadow detail is lost, but the overall image at least looks something like what you shot.
Better cameras have better dynamic range: a mediocre phone camera might only be able to show 4567, while a great dSLR might be able to show 234567.
HDR+ is clever. Once you open the camera app, it's running. So when you finally press the button to take a picture, it has 15 to 20 seconds' worth of exposures already stored in the camera's memory. Those images are actually underexposed. It takes them and combines them with the moment you picked to create an image with greater dynamic range.
To use our example, let's say the camera itself captures 5678. The HDR+ processing also captured 4567. It combines the two images, applies some smart processing, and now you have an image that is almost-but-not-quite 45678.
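Here's a rough sketch of why stacking many underexposed frames helps. This is an illustration under stated assumptions (uniform sensor noise, a fixed 0.25x gain), not Google's actual pipeline: each underexposed frame is noisy but nothing in it clips, and averaging the stack cancels the noise while keeping the highlights.

```python
import random

random.seed(0)
true_luminance = [10, 40, 120, 400, 900]   # hypothetical scene values

def capture_underexposed(scene, gain=0.25, noise=3.0):
    """One noisy, underexposed frame: nothing clips at 255,
    but shadow values are small and noise-dominated."""
    return [min(255.0, v * gain + random.uniform(-noise, noise))
            for v in scene]

# Stack 20 frames (as if captured before the shutter press), then
# average and rescale: the noise shrinks, the highlights stay intact.
frames = [capture_underexposed(true_luminance) for _ in range(20)]
stacked = [sum(px) / len(frames) / 0.25 for px in zip(*frames)]
print([round(v) for v in stacked])         # close to true_luminance
```

A single one of these frames would look grainy and dark; it's the stack plus the rescale that yields a clean, wider-range result.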
The ultra-processed HDR made popular on Instagram and shown in the "Even more detail" section above? Those are multiple exposures, so "2345," "4567" and "6789." That image doesn't look like reality, though, does it? Instead of 123456789 it's more like 13579. You have details that were there, and were perhaps not captured by a "normal" photo, but not really a greater dynamic range, since you can still see it all on your non-HDR monitor.
How does this compare to "true" HDR? The cameras used to make big Hollywood movies are really expensive and use sensors that are significantly better than what you can get in a consumer camera. They can capture, in our example, something like 2345678 or maybe even 12345678.
HDR TVs can show, when given real HDR content, 345678. Non-HDR TVs can show 4567 or maybe 34567.
Or here's the ultimate way to think about all this. An HDR photo can be seen, in its entirety, on your current computer monitor or phone screen. True HDR content on HDR TVs can't be shown accurately on your current screen, nor accurately photographed, because it goes beyond what your current screen is capable of.
Photo HDR is cool, or at least, can be cool. In the right hands, it can create images that aren't possible with the limited range of modern camera (especially phone camera) sensors. It can also be applied as an after-effect or filter for decidedly unnatural images.
TVs with HDR offer an actual expanded range compared to their non-HDR counterparts, especially when provided with actual HDR content. Of course, TV makers will also likely use "HDR" processing to "improve" regular content, and again the effect might be nothing like the director intended.
In other words, TV HDR isn't going to be the next artificial-looking gimmick. If done right, it will bring TV images one step closer to reality, much like good HDR in cameras can do today.
Got a question for Geoff? First, check out all the other articles he's written on topics like why all HDMI cables are the same, and more. Still have a question? Send him an email! He won't tell you what TV to buy, but he might use your letter in a future article. You can also send him a message on Twitter @TechWriterGeoff or Google+.