There’s a new high dynamic range format coming. It’s called HDR10+. Here’s what you need to know.
Geoffrey Morrison is a writer/photographer about tech and travel for CNET, The New York Times, and other web and print publications.
HDR is the latest enhancement to make it to televisions, streaming devices and your favorite TV shows and movies. The good news is that it can deliver the best picture quality available in home video today. The bad news? It comes in multiple formats, adding one more potential point of confusion to the already overwhelming TV buying process.
We already have HDR10, Dolby Vision and HLG, but apparently that's not enough. Now there's a new one, HDR10+, the "+" being the key. Created by Samsung, HDR10+ has recently gotten some other big-name backers, like Panasonic, Philips, Amazon and 20th Century Fox.
So with the manufacturing and content side on board (some companies, anyway), it's starting to look like HDR10+ could be the real deal.
So what makes it different from the others? I'm glad you asked.
Data, meta and otherwise
To explain the difference between HDR10 and HDR10+, we need to talk about metadata. Metadata is additional info, beyond the video signal itself, that gets transmitted along with an HDR movie or TV show. It basically tells the TV how to show the high dynamic range content. It's like secret Ikea instructions that turn your Billy bookcase into a library.
HDR10 has static metadata; HDR10+ and Dolby Vision have dynamic metadata. Since this is one of the biggest differences between HDR10 and DV, adding it to the license-free HDR10 is potentially a big deal.
With HDR10, the TV gets one set of instructions at the beginning of the show or movie. This single, static set says, "OK, when this show says jump, this is how high." This is fine, but it's a one-size-fits-all approach. If a movie, say, has a wide variety of scenes, this single piece of metadata might not allow for the best image.
Dolby Vision has dynamic metadata -- and soon HDR10+ will, too. This allows for fine-tuning how the HDR looks not just across the entire movie, but all the way down to a per-scene or even per-frame basis. Most content probably won't go that far, but this extra level of control lets filmmakers decide exactly how everything shot in a movie should look on your TV. Potentially, this could mean better picture quality than vanilla HDR10. Now a movie can give a TV instructions on how high to jump essentially on a continuous basis. (Very bossy.)
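If you want a rough mental model of the difference, here's a toy sketch in Python. It is not the actual HDR10 or HDR10+ bitstream format; the field name MaxCLL (maximum content light level, in nits) is a real HDR10 static-metadata concept, but the per-scene structure and the tone-mapping function are simplified illustrations.

```python
# Illustrative sketch only -- not the real HDR10/HDR10+ data format.

# Static metadata (HDR10): one set of values for the entire movie,
# sized for the brightest moment anywhere in the film.
static_metadata = {"MaxCLL": 4000}  # nits

# Dynamic metadata (HDR10+ / Dolby Vision): values per scene.
dynamic_metadata = [
    {"scene": "dim interior", "MaxCLL": 120},
    {"scene": "sunlit desert", "MaxCLL": 4000},
]

def tone_map_peak(scene_max_nits, tv_peak_nits=1000):
    """Pick the brightness the TV maps the scene's peak to,
    capped by what the panel can physically produce."""
    return min(scene_max_nits, tv_peak_nits)

# With static metadata, every scene is treated as if it could hit
# 4,000 nits, so the dim interior gets squeezed the same as the desert.
print(tone_map_peak(static_metadata["MaxCLL"]))  # 1000, for every scene

# With dynamic metadata, each scene is mapped on its own terms,
# so the dim interior keeps its full range.
for scene in dynamic_metadata:
    print(scene["scene"], tone_map_peak(scene["MaxCLL"]))
# dim interior 120
# sunlit desert 1000
```

The point of the sketch: with one global number, the TV has to tone-map the whole movie for its worst case; with per-scene numbers, it only compresses the scenes that actually need it.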
Here's how Samsung describes it:
HDR10+ provides for scene-by-scene adjustments for the optimum representation of contrast from the HDR source content. Being an open format, it's license/royalty free and therefore easily adoptable by manufacturers and content producers with quality maintained through a HDR10+ certification and logo program.
Oh, and just so there's no confusion, HDR10+ has absolutely nothing to do with Google's HDR+, an enhancement to camera phones. Similar names, totally unrelated. Well, they both have to do with HDR, but otherwise, not the same.
If you read all this and decided that HDR10+ exists because Samsung doesn't want to pay Dolby licensing fees for HDR, well, you'd be right. That's definitely the reason, though I'm sure they also just want HDR to succeed.
With the HDR10 ecosystem being a bit like the wild wild west, adding another layer of complexity could create additional problems. Will HDR10+ look the same, worse, or better than Dolby Vision? Impossible to say. Most likely it will come down to the specific transfers, content and so on.
Or to put it another way, it's probable that HDR10+ and Dolby Vision will look about the same. Dolby's ace in the hole is, and will be, its hands-on involvement with the TVs themselves. A manufacturer pays Dolby not just for the ability to decode Dolby Vision content. Dolby will also show them how to make their TV look as good as possible with said DV content. There's nothing like that on the HDR10 side -- but there might be with HDR10+.
You may have noticed in the quote above a mention of a "certification." There are no details about this yet, but according to Samsung, they hope to have something to announce in January 2018. They told CNET it will be a "quality-based certification and logo program for devices." What level of performance TVs will have to meet to be certified, we'll have to wait and see. This could be similar to what Dolby does, or it could be as simple as "yep, that's an image."
Lastly, the question you've probably wanted to ask this whole time: Will your TV work with it? Maybe yes, maybe no. Once again, Samsung:
Our entire 2017 lineup has an HDR10+-capable engine, and we will consider them for certification when the program is announced. We are evaluating options for 2016 displays.
Whether other companies will join the HDR10+ bandwagon remains to be seen. LG has Dolby Vision, and it's not like they jump at things created by Samsung. Other companies, we shall see. If they sign on, will they be able to firmware-update HDR TVs to work with HDR10+? I wouldn't count on it, but I suppose it's possible.