
What is HDR for TVs, and why should you care?

High-dynamic range (HDR) TVs, movies and TV shows are here. Here's what you need to know.


HDR can deliver brighter highlights, as seen on the TV on the right. Keep in mind that on the non-HDR screen you're reading this on, the brighter highlights won't appear physically brighter as they would in real life.

Geoffrey Morrison

HDR, or high-dynamic range, is the current "must-have" TV feature. TVs that support it can usually offer brighter highlights and a wider range of color detail, for a punchier image overall.

HDR-compatible TVs are now very common: nearly all midrange and high-end models support it. At the same time, HDR TV shows and movies are becoming more common, both on streaming services like Netflix and on Ultra HD Blu-ray disc.

Is this new technology worth the hype? In two words: largely, yes. I am pretty jaded when it comes to new TV tech, and I'm really excited about HDR. 

  • HDR requires both a TV that supports it and special HDR content.
  • HDR images can achieve brighter highlights with more contrast.
  • Many HDR TVs also have wide color gamut, resulting in deeper, richer colors with content that supports it.
  • HDR on a budget HDR TV and HDR on an expensive HDR TV can look very different. With some budget TVs, HDR can even look worse than non-HDR.
  • Almost all HDR content today is also available in 4K resolution.
  • There are numerous HDR formats, including "generic" HDR (aka HDR10), Dolby Vision, HDR10+ and more.

What is high-dynamic range?

The two most important factors in how a TV looks are contrast ratio, or how bright and dark the TV can get, and color accuracy, which is basically how closely colors on the screen resemble real life (or whatever palette the director intends). This isn't just my opinion, but also that of nearly every other TV reviewer, people who have participated in multi-TV face-offs at stores and for websites/magazines, and industry experts like the Imaging Science Foundation.

If you put two TVs side by side, one with a better contrast ratio and more accurate color, and the other with just a higher resolution (more pixels), the one with the greater contrast ratio will be picked by pretty much every viewer. It will look more natural, "pop" more and just seem more "real," despite having lower resolution. 

HDR expands the range of both contrast and color significantly. Bright parts of the image can get much brighter, so the image seems to have more "depth." Colors get expanded to show more bright blues, greens, reds and everything in between.
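To put rough numbers on that, contrast is usually expressed as the ratio of a TV's peak brightness to its black level, or in photographic "stops" (each stop is a doubling). Here's a minimal Python sketch using illustrative figures, not measurements of any specific TV:

```python
import math

def contrast_stats(peak_nits, black_nits):
    """Return (contrast ratio, dynamic range in photographic stops)."""
    ratio = peak_nits / black_nits
    stops = math.log2(ratio)  # each stop is a doubling of brightness
    return ratio, stops

# Illustrative figures only, not measurements of any specific TV.
sdr_ratio, sdr_stops = contrast_stats(peak_nits=300, black_nits=0.05)
hdr_ratio, hdr_stops = contrast_stats(peak_nits=1000, black_nits=0.01)
print(f"SDR-ish TV: {sdr_ratio:,.0f}:1 contrast, {sdr_stops:.1f} stops")
print(f"HDR-ish TV: {hdr_ratio:,.0f}:1 contrast, {hdr_stops:.1f} stops")
```

The hypothetical HDR set wins on both ends of the scale, and those extra stops are exactly the "depth" described above.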

Wide color gamut (WCG) is along for the ride with HDR, and that brings even more colors to the table. Colors that, so far, were impossible to reproduce on any television. The reds of a fire truck, the deep violet of an eggplant, even the green of many street signs. You may have never noticed before that these weren't exactly how they looked in real life, but you sure will now. WCG will bring these colors and millions more to your eyeballs.

For a bunch of background info on how color works on TVs, check out Ultra HD 4K TV color, part I: Red, green, blue and beyond, and Ultra HD 4K TV color, part II: The (near) future.
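The gamut difference can even be quantified. Each standard defines its red, green and blue primaries as points on the CIE 1931 xy diagram, and the triangle they enclose is the gamut. The sketch below computes each triangle's area with the shoelace formula (note that xy area overstates greens compared with the perceptual u'v' diagram, so treat the ratios as rough):

```python
def triangle_area(primaries):
    """Shoelace area of a gamut triangle given [(x, y), ...] in CIE xy."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Standard xy chromaticities of the red, green and blue primaries.
rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # SDR/HDTV
dci_p3  = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # common HDR target
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # HDR container

base = triangle_area(rec709)
for name, prims in [("P3", dci_p3), ("Rec 2020", rec2020)]:
    print(f"{name} covers about {triangle_area(prims) / base:.2f}x the Rec 709 xy area")
```

P3 comes out at roughly 1.4x the Rec 709 triangle and Rec 2020 at nearly 1.9x, which is the gap the chart below visualizes.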


Dolby Vision is one of a handful of HDR formats.

Sarah Tew/CNET

Photo HDR isn't TV HDR

One of the most important things to know about HDR TVs is that TV HDR is not the same as photo HDR. Every article I've written about HDR has comments from people complaining about the hyper-realistic look common with HDR photography. These are two very different things that, unfortunately and confusingly, just happen to share the same name. Like football and football.

I wrote an entire article about the difference, but the main takeaway is that HDR for TVs is not a picture-degrading gimmick (akin to the soap opera effect). It is definitely not that.

TV HDR: Expanding the TV's contrast ratio and color palette to offer a more realistic, natural image than what's possible with today's HDTVs.

Photo HDR: Combining multiple images with different exposures to create a single image that mimics a greater dynamic range.


Photo HDR: Taking two or more images (left and center) and combining them to show some aspects of both (right).

Geoffrey Morrison

HDR for TVs aims to show you a more realistic image, one with more contrast, brightness and color than before.

An HDR photo isn't "high-dynamic range" in this sense. The image doesn't have the dynamic range possible in true HDR. It's still a standard dynamic range image, it just has some additional info in it due to the additional exposures.
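For the curious, the photo technique can be sketched in a few lines. This toy fusion (a simplification of Mertens-style exposure fusion, which also weights saturation and local contrast) scores each pixel by how close it is to mid-gray, so the better-exposed shot wins at each spot:

```python
def fuse(exposures):
    """Naive exposure fusion: weight each pixel by how close it sits to
    mid-gray (0.5), so well-exposed pixels dominate the merged result.
    Pixels are brightness floats in 0..1; one flat list per exposure."""
    fused = []
    for pixels in zip(*exposures):
        # Weight peaks at 0.5 brightness, falls to ~0 at pure black/white.
        weights = [1e-6 + (1 - abs(p - 0.5) * 2) ** 2 for p in pixels]
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / sum(weights))
    return fused

dark   = [0.02, 0.10, 0.45]  # underexposed shot: keeps highlight detail
bright = [0.40, 0.70, 0.98]  # overexposed shot: keeps shadow detail
print(fuse([dark, bright]))  # each output pixel leans toward the usable exposure
```

The output is still an ordinary standard-dynamic-range image; it just borrows detail from both shots, which is exactly the distinction drawn above.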

A TV HDR image won't look different the way a photo HDR image does. It merely looks better.

I hate to belabor the point, but due to the two processes sharing the same name, this understanding is really the biggest hurdle HDR faces. Those with an open mind might seek out HDR to find out what it is, and be blown away by a demo -- and the demos are amazing. Those convinced HDR isn't worth their time won't ever bother to see the demo and will poison the well (so to speak).


In this color gamut graph, the smallest triangle (with circles at its corners) is standard SDR color. The two larger triangles represent P3 and Rec 2020 color, both part of HDR.

Geoffrey Morrison/CNET/Sakurambo

How does it work?

There are two parts of the HDR system: the TV and the source.

The first part, the TV, is actually the easier part. To be HDR-compatible, the TV should be able to produce more light than a normal TV in certain areas of the image. This is basically just like local dimming, but to an even greater extent.

Tied in with HDR is wide color gamut, or WCG. For years, TVs have been capable of a greater range of colors than what's possible in Blu-ray or downloads/streaming. The problem is, you don't really want the TV just creating those colors willy-nilly. It's best left to the director to decide how they want the colors of their movie or TV show to look, not a TV whose color expanding process might have been designed in a few days 6,000 miles from Hollywood. More on this in a moment.

Of course, making TVs brighter and more colorful costs money, and some HDR TVs will deliver better picture quality than others. Just because a TV is HDR-compatible doesn't necessarily mean it's going to outperform non-HDR TVs. The only thing the HDR label really means is that the TV will be able to display HDR movies and TV shows. It has nothing to do with how well it can show those images.

The content is the hard part. To truly look good, an HDR TV needs HDR content. Fortunately, the amount of HDR content is growing fast. The major 4K streaming services like Netflix and Amazon both have HDR content, as do many others.

Another source of HDR is physical discs. Ultra HD Blu-ray is the latest physical disc format. You'll need a new 4K BD player to play these discs, but your current Blu-ray and DVDs will play on the new players. Most 4K Blu-ray discs have HDR as well.

HDR content (the key)

When a movie or TV show is created, the director and cinematographer work with a colorist to give the program the right look. Take the muted, cold color tones of Winterfell in Game of Thrones versus the richness and warmth in King's Landing. If you've been living in a cave without HBO or the Internet, here's what I mean:

HBO

It's entirely possible that if you were on set for these two scenes, they would have looked the same, color-wise. Post-production tweaking can imbue a scene with a certain aesthetic and feeling, just with color.

When making movies, the team is able to use the wide palette of the Digital Cinema P3 color space to create gorgeous teals, oranges and violets.

But then comes the time to make these movies work on TV. To do that, the team essentially "dumbs down" the image, removing dynamic range and limiting color. They get it to look the way they want, given the confines of the HDTV system, and that limited version is what you get on Blu-ray or a download.


If your TV is set to the Movie or Cinema mode, this is approximately what you'll get at home. If you're in the Vivid or Dynamic mode, the TV will then exaggerate the colors as it sees fit. It's creating something that isn't there, because at the mastering stage, the director and her team had to take that all out. Is the "Vivid" version close to what they saw or what was in the theater? Doubtful, and there's no way to know since it's your TV's creation.

Thanks to the additional storage and transmission capacities of 4K BD and streaming video from Amazon, Netflix and others, additional data, called metadata, can be added to the signal. It tells HDR/WCG TVs exactly how they should look, exactly what deeper colors to show, and exactly how bright a given highlight, reflection, star, sun, explosion or whatever should be. This is a huge advancement in how we're able to see images on TVs.
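That "exactly how bright" part works because HDR signals use a new transfer function. Here's a minimal Python sketch of the PQ curve (SMPTE ST 2084) that HDR10 and Dolby Vision build on, which maps a code value straight to an absolute brightness in nits:

```python
def pq_eotf(signal):
    """SMPTE ST 2084 "PQ" EOTF used by HDR10 and Dolby Vision: maps a
    0..1 code value to absolute luminance in nits (cd/m^2), up to the
    format's 10,000-nit ceiling. Constants come from the standard."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

# A full-scale code value hits the 10,000-nit ceiling; a value of 0.75
# lands near 1,000 nits, a common peak for today's HDR TVs.
print(pq_eotf(1.0), pq_eotf(0.75))
```

Because the curve is absolute rather than relative, the mastering suite and your living-room TV agree on what "1,000 nits" means, which is what makes the metadata meaningful.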

One example of how this is done is Technicolor's Intelligent Tone Mapping tool for content creators. It's designed to let creators more easily (as in, more affordably) create HDR content. I've seen it in action, and the results are very promising. This is a good thing, as it means it's not labor-intensive to create HDR versions of movies and shows. If it took tons of time, and time equals money, we'd never get any HDR content. This is just one example of the process.
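To illustrate the general idea of tone mapping (this is a generic Reinhard-style curve, not Technicolor's actual tool), here's how content mastered for a very bright display can be squeezed onto a dimmer one without simply clipping the highlights:

```python
def tone_map(nits, display_peak=600):
    """Reinhard-style tone curve (a generic sketch, not any vendor's
    tool): compress scene luminance so highlights brighter than the
    display's peak roll off smoothly instead of clipping to white."""
    x = nits / display_peak
    return display_peak * (x / (1 + x))

for highlight in (100, 600, 4000):
    print(f"{highlight:>5} nits -> {tone_map(highlight):.0f} nits on a 600-nit TV")
```

Note the trade-off: a 4,000-nit explosion survives with some detail, but midtones get dimmed too, which is why real mastering tools use more sophisticated curves.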

What about cables and connectors?

You won't need new cables for HDR... probably. Even if you do need new cables, they're very inexpensive: current High Speed HDMI cables can carry HDR. Your source device (a 4K Blu-ray player or media streamer, say) and the TV must both be HDR-compatible, regardless of what cables you use. If you use a receiver, it too must be HDR-compatible to pass the signals from the source to the TV. 

If you've bought your gear in the last few years, it's probably HDR-compatible. If you're not sure, put the model number into Google with "HDR" after it and see what comes up. 

The next generation of HDMI connection is called 2.1, and it adds a number of new features, including some improvements to how HDR is handled. It's something to keep in mind for your next purchase, but it doesn't make your current gear obsolete and will largely be backward compatible (other than the new features). 

Bottom line

Most experts I've spoken with say something along the lines of "More pixels are cool, but better pixels would be amazing." Which is to say 4K and 8K resolutions are fine, but HDR and WCG are far more interesting. What we've seen, now that we've had a few generations of HDR TVs to sort out the bugs, is a general improvement in overall image quality, though perhaps not quite to the extent many of us (myself included) initially expected. That said, a well-performing HDR TV showing HDR content will look better than the TVs from just a few years ago. In some cases they're significantly brighter, with a much wider range of colors, which is quite a sight to see. Check out our reviews for the best TV right now.

If you're curious about how HDR works, check out the aptly named How HDR works.

Note: This article was originally published in 2015 but was updated in 2019 with current info and links.  


Got a question for Geoff? First, check out all the other articles he's written on topics like why all HDMI cables are the same, TV resolutions explained, LED LCD vs. OLED and more.

Still have a question? Tweet at him @TechWriterGeoff, then check out his travel photography on Instagram. He also thinks you should check out his best-selling sci-fi novel and its sequel.