What is HDR for TVs, and why should you care?

High dynamic range (HDR) TVs are here and so is the first HDR content, with more of both on the way. Is this next-gen TV technology worth getting in your next TV?

Geoffrey Morrison

HDR, or high dynamic range, is the next big thing in TVs.

We've been talking about it for several years, but HDR-compatible TVs are now far more common. Nearly all midrange and high-end TVs for 2017 have HDR, and HDR content is becoming more plentiful, both on streaming services like Netflix and on Ultra HD Blu-ray discs.

Is this new technology worth the hype? In two words: largely, yes. I am pretty jaded when it comes to new TV tech, and I'm really excited about HDR. And I'm not the only one.

What is high dynamic range?

The two most important factors in how a TV looks are contrast ratio, or how bright and dark the TV can get, and color accuracy, which is basically how closely colors on the screen resemble real life (or whatever palette the director intends). This isn't just my opinion, but also that of nearly every other TV reviewer, people who have participated in multi-TV faceoffs at stores and for websites/magazines, and industry experts like the Imaging Science Foundation and Joe Kane.

If you put two TVs side by side, one with a better contrast ratio and more accurate color, and the other with just a higher resolution (more pixels), the one with greater contrast ratio will be picked by pretty much every viewer. It will look more natural, "pop" more, and just seem more "real," despite having lower resolution. In other words, a 1080p resolution TV with excellent contrast and color beats a 4K resolution TV with average contrast and color every time.

HDR expands the range of both contrast and color significantly. Bright parts of the image can get much brighter, so the image seems to have more "depth." Colors get expanded to show more bright blues, greens, reds and everything in between.

Wide color gamut (WCG) is coming along for the ride with HDR, and that brings even more colors to the table -- colors that, until now, were impossible to reproduce on any television. The reds of a fire truck, the deep violet of an eggplant, even the green of many street signs. You may never have noticed that these weren't exactly how they looked in real life, but you sure will now. WCG will bring these colors and millions more to your eyeballs.
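If you want a rough sense of how much bigger these palettes are, you can compare the area of each standard's color triangle on the CIE 1931 chromaticity chart. The primaries below are the published (x, y) coordinates for Rec. 709 (today's HDTV gamut), DCI-P3 (digital cinema) and Rec. 2020 (the Ultra HD target); the little shoelace-formula script is a back-of-the-envelope sketch, not a perceptual measurement.

```python
# Compare color-gamut triangle areas on the CIE 1931 xy chromaticity diagram.
# Primaries are the published (x, y) chromaticity coordinates for each standard.
GAMUTS = {
    "Rec. 709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],  # HDTV
    "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],  # digital cinema
    "Rec. 2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],  # UHD target
}

def triangle_area(pts):
    """Shoelace formula for the area of a triangle given three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ref = triangle_area(GAMUTS["Rec. 709"])
for name, pts in GAMUTS.items():
    area = triangle_area(pts)
    print(f"{name:9s} area = {area:.4f}  ({area / ref:.2f}x Rec. 709)")
```

On these numbers, DCI-P3's triangle works out to roughly 1.4 times the area of Rec. 709's, and Rec. 2020's to nearly double -- a crude but honest way to see that "wide color gamut" is literal.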

For a bunch of background info on how color works on TVs, check out Ultra HD 4K TV color, part I: Red, green, blue and beyond, and Ultra HD 4K TV color, part II: The (near) future.

Photo HDR isn't TV HDR

One of the most important things to know about HDR TVs is that TV HDR is not the same as photo HDR. Every article I've written about HDR has comments from people complaining about the hyper-realistic look common with HDR photography. These are two very different things that, unfortunately and confusingly, just happen to share the same name. Like football and football.

I wrote an entire article about the difference, but the main takeaway is that HDR for TVs is not a picture-degrading gimmick (akin to the soap opera effect). It is definitely not that.

TV HDR: Expanding the TV's contrast ratio and color palette to offer a more realistic, natural image than what's possible with today's HDTVs.

Photo HDR: Combining multiple images with different exposures to create a single image that mimics a greater dynamic range.

Photo HDR: Taking two or more images (left and center) and combining them to show some aspects of both (right). Geoffrey Morrison

HDR for TVs aims to show you a more realistic image, one with more contrast, brightness and color than before.

An HDR photo isn't "high dynamic range" in this sense. The image doesn't have the dynamic range possible in true HDR. It's still a standard dynamic range image; it just has some additional information in it thanks to the additional exposures.

A TV HDR image won't look different the way a photo HDR image does. It merely looks better.

I hate to belabor the point, but because the two processes share the same name, this confusion is really the biggest hurdle HDR faces. Those with an open mind might seek out HDR to find out what it is, and be blown away by a demo -- and the demos are amazing. Those convinced HDR isn't worth their time won't ever bother to see the demo, and will poison the well (so to speak).

How does it work?

There are two parts of the HDR system: the TV and the source.

The first part, the TV, is actually the easier part. To be HDR-compatible, the TV should be able to produce more light than a normal TV in certain areas of the image. This is basically just like local dimming, but to an even greater range.

Tied in with HDR is wide color gamut, or WCG. For years, TVs have been capable of a greater range of colors than what's possible in Blu-ray or HD downloads/streaming. The problem is, you don't really want the TV just creating those colors willy-nilly. It's best left to the director to decide how they want the colors of their movie or TV show to look, not a TV whose color expanding process might have been designed in a few days 6,000 miles from Hollywood. More on this in a moment.

Of course, making TVs brighter and more colorful costs money, and some HDR TVs will deliver better picture quality than others. Just because a TV is HDR-compatible doesn't necessarily mean it's going to outperform non-HDR TVs. The only thing the HDR label really means is that the TV will be able to display HDR movies and TV shows, not how well.

The content is the hard part. To truly look good, an HDR TV needs HDR content. Fortunately, the amount of HDR content is growing fast. Major 4K streaming services such as Netflix and Amazon offer HDR content, as do many others.

Another source of HDR is physical discs. Ultra HD Blu-ray is the latest physical disc format. You'll need a new UHDBD player to play these discs, but your current Blu-rays and DVDs will play on the new players too. Not all UHDBD discs have HDR, but many do.

HDR content (the key)

When a movie or TV show is created, the director and cinematographer work with a colorist to give the program the right look. Take the muted, cold color tones of Winterfell in "Game of Thrones" versus the richness and warmth in King's Landing. If you've been living in a cave without HBO or the Internet, here's what I mean:


It's entirely possible that if you were on set for these two scenes, they would have looked the same, color-wise. Post-production tweaking can imbue a scene with a certain aesthetic and feeling, just with color.

When making movies, the team is able to use the wide palette of the Digital Cinema P3 color space to create gorgeous teals, oranges and violets.

But then comes time to make these movies work on TV. In order to do that, that team essentially "dumbs down" the image, removing dynamic range and limiting color. They get it to look the way they want, given the confines of the HDTV system, and that limited version is what you get on Blu-ray or a download.

If your TV is set to the Movie or Cinema mode, this is approximately what you'll get at home. If you're in the Vivid or Dynamic mode, the TV will then exaggerate the colors as it sees fit. It's creating something that isn't there, because at the mastering stage, the director and her team had to take that all out. Is the "Vivid" version close to what they saw or what was in the theater? Doubtful, and there's no way to know since it's your TV's creation.

Thanks to the additional storage and transmission capacities of 4K BD and streaming video from Amazon, Netflix and others, additional data, called metadata, can be added to the signal. It tells HDR/WCG TVs exactly how they should look, exactly what deeper colors to show, and exactly how bright a given highlight, reflection, star, sun, explosion or whatever should be. It can even adjust picture settings or put the TV in a certain picture mode automatically. This is a huge advancement in how we're able to see images on TVs.
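For the curious, here's a rough sketch of what the static metadata in HDR10, the baseline HDR format, carries: the SMPTE ST 2086 description of the display the colorist graded on, plus the MaxCLL and MaxFALL content light levels. The field names and example values below are illustrative only, not pulled from any real disc.

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Illustrative sketch of HDR10 static metadata (SMPTE ST 2086 + content light levels).

    Luminance values are in cd/m^2 (nits); chromaticities are CIE 1931 (x, y).
    """
    # Mastering display color volume (ST 2086): the screen the colorist graded on.
    display_primaries: dict          # {"red": (x, y), "green": (x, y), "blue": (x, y)}
    white_point: tuple               # (x, y), typically D65
    max_mastering_luminance: float   # brightest the mastering display can go
    min_mastering_luminance: float   # darkest it can go
    # Content light levels: describe the program itself.
    max_cll: int                     # Maximum Content Light Level (brightest single pixel)
    max_fall: int                    # Maximum Frame-Average Light Level

# Hypothetical values for a title graded on a 1,000-nit DCI-P3 display.
example = HDR10StaticMetadata(
    display_primaries={"red": (0.680, 0.320), "green": (0.265, 0.690), "blue": (0.150, 0.060)},
    white_point=(0.3127, 0.3290),
    max_mastering_luminance=1000.0,
    min_mastering_luminance=0.0001,
    max_cll=1000,
    max_fall=400,
)
print(f"Graded up to {example.max_mastering_luminance:.0f} nits; brightest pixel {example.max_cll} nits")
```

Note the word "static": HDR10 sends one set of values for the whole program, and the TV uses them to map the picture into whatever brightness and color it can actually produce.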

Technicolor's Intelligent Tone Mapping is a tool for content creators to more easily (as in, more affordably) create HDR content. I've seen it in action, and the results are very promising. This is a good thing, as it means it's not labor-intensive to create HDR versions of movies and shows. If it took tons of time, and time equals money, we'd never get any HDR content. This is just one example of the process.

What about cables and connectors?

You won't need new cables for HDR... probably. Current High Speed HDMI cables can carry HDR. The source device (an Ultra HD Blu-ray player, say) and the TV must both be at least HDMI 2.0a to transmit the metadata, however. If you have a receiver and want to use it for switching, it will need to be HDMI 2.0a as well.

If you bought a receiver or media streamer in the last few years and want to try HDR, it's best to check with the manufacturer that it supports HDR. If your TV is HDR-compatible, its internal streaming apps should support HDR as well, so you can go that route too.

Bottom line

Most experts I've spoken to, on both the content side and the TV side, are excited about HDR and WCG. 4K by itself never got anyone in those camps that excited. The common refrain was "More pixels are cool, but better pixels would be amazing."

Though breathlessly claimed as the next-generation TV evolution, 4K was anything but. Now, with HDR and WCG, we're looking at the promised evolution, and it should be a brighter and more colorful one.

Got a question for Geoff? First, check out all the other articles he's written on topics like why all HDMI cables are the same, TV resolutions explained, LED LCD vs. OLED, and more. Still have a question? Tweet him @TechWriterGeoff then check out his travel photography on Instagram. He also thinks you should check out his bestselling sci-fi novel and its sequel.