
What is gamma and HDR EOTF on TVs, and why should you care?

No, we're not talking about the radiation that created The Hulk. Here's why shades of gray mean so much to TV image quality.

Geoffrey Morrison Contributor

Gamma. Unless you're talking about Bruce Banner, it's probably not high on the list of interesting things. In fact, in this context, we're not even talking about the highest-energy form of light. Gamma rays, especially in space, are super interesting and cool compared to gamma on TVs.

Most people have probably never even heard of gamma in the context of a TV or projector. But that doesn't make it any less important to picture quality. An obscure, behind-the-scenes process called gamma correction has been crucial to how your TV has looked for decades. And its current and future incarnation, the Electro-Optical Transfer Function (EOTF), is equally important to how TV pictures will look going forward into the age of high dynamic range.

Getting to know how gamma and EOTF work will give you a better idea of how your TV and video itself work, and help you figure out what the gamma setting on your TV "should" be. Spoiler: there's no simple answer.

This conversation could easily devolve into math, which is even more boring than talking about gamma itself, so instead, let's talk about gray.

Shades of gray

Which of these images of a plane looks correct? 


Note the difference in the brightness of the shadows (foreground), mid-tones (background plane and mountains), and highlights (clouds).

Geoffrey Morrison/CNET

The above photoshopped illustration gives you an idea of what different gamma settings would look like. The photo is from my Instagram and my tour of the Palm Springs Air Museum.

As far as I'm concerned, the correct image is the middle image, since that's what I wanted you to see. Your TV, though, might show you something more like the one on the left or right. For that matter, you might adjust your TV's gamma setting (if it has one), to look more like the left or right image. And that's… fine. I mean, I'm not thrilled because it's my picture and you're making it look terrible. But in the context of standard video, you can adjust the gamma as you see fit.

Essentially, gamma is the conversion between what the incoming video signal says and what the TV will create. It's a curve used on the encoding side, such as in a video camera, and an alternate curve on the decoding side.

Historically, the gamma curve was a way to counteract the way ancient tube TVs displayed an image, and was built into the video cameras themselves. In the modern era of flat-panel TVs, it's ideally used to tailor image quality to room lighting.
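If it helps to see the idea in code, here's a minimal sketch of that relationship, assuming a plain power-law curve (real standards add black-level terms and other refinements, but the shape of the idea is the same):

```python
# Minimal sketch of gamma as a power-law curve, for illustration only.
# Signal and light are normalized 0-1 values; a real TV's processing is
# more involved than this.

def gamma_encode(light, gamma=2.2):
    """Curve applied on the camera/encoding side."""
    return light ** (1.0 / gamma)

def gamma_decode(signal, gamma=2.2):
    """Curve applied on the TV/decoding side."""
    return signal ** gamma

# A mid-gray signal of 0.5 comes out at roughly 22% of maximum brightness...
print(gamma_decode(0.5))                 # ~0.218
# ...and the encode/decode pair round-trips back to the original value.
print(gamma_decode(gamma_encode(0.5)))   # ~0.5
```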


Examples of three gamma curves. 

Geoffrey Morrison/CNET

You can see examples of three gamma curves above. Linear, on the left, is what you'd expect: a 1:1 ratio between the incoming video and what the TV produces. But in reality it can be more like the others. A low gamma, with a shallow curve like the middle, is more appropriate for bright rooms and non-movie content. The higher gamma, on the right, is typically better for movies and darker rooms. In that example, all shadows (represented on the lower part of the curve) will be darker. The brighter parts of the image (upper right of each graph) aren't as affected.

If you adjust the gamma on your TV, it changes the apparent "grayness" of blacks, shadows, midtones, and to a lesser extent, even highlights. Gamma describes how the image transitions from black to white, and affects all the grays in between. 

A high gamma, that is, a steeper curve, means a wider range of shadows will be darker. It can make an image look dark and contrasty, and can obscure details in shadows. It's the reason why TV reviewers often harp on shadow detail. A low gamma has a shallower curve, so shadows will appear brighter. It can make an image look washed out and flat. (There's a bit more to it, but I'll get to that in the next section.)
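To put rough numbers on that, here's a quick power-law sketch (example values, not any particular TV's processing) showing how the same shadow and highlight signals land at a few common gamma values:

```python
# Rough illustration, assuming a plain power-law gamma: the same dark signal
# changes a lot between gamma values, while a bright signal barely moves.

shadow = 0.2     # a dark part of the picture
highlight = 0.9  # a bright part of the picture

for g in (2.0, 2.2, 2.4):
    print(f"gamma {g}: shadow -> {shadow ** g:.3f}, highlight -> {highlight ** g:.3f}")

# gamma 2.0: shadow -> 0.040, highlight -> 0.810
# gamma 2.2: shadow -> 0.029, highlight -> 0.793
# gamma 2.4: shadow -> 0.021, highlight -> 0.777
```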

Settings if you've got 'em

So what's correct? Well, that depends on who you ask, what you're viewing and, interestingly enough, where you're viewing it.

Typically gamma curves are represented as numbers, and which curve looks best can be a matter of taste. Some viewers like 2.4, while others, including myself, find that it looks way too dark. On the other side of the equation, 1.8 has a computery feel to it, and can look lifeless and washed out. I tend to prefer something around 2.2, but really, it's up to you. "Film purists" will scream that 2.4 is the only option, while gamers might say 2.0 lets them see into the shadows better in an otherwise dark game.

As I alluded to earlier, the curve number itself doesn't take into account the environment where you're viewing. If you're watching TV in a bright room, 2.4 will seem too dark, with the shadows too hard to see. If you're watching in a dark room, 2.0 might seem too washed out, with the shadows unnaturally bright.

That has to do with how your eye sees, as demonstrated by these two gorgeous boxes below.


This illusion illustrates how light surrounding an image affects its perceived brightness. 

Geoffrey Morrison/CNET

Believe it or not, the squares above are the exact same shade of gray, but for most viewers the one on the left looks brighter. They only appear different because of what's around them. In the real world, the box is your TV and the area around it is your room.

If your TV has a gamma setting, find some scenes that take place at night or are otherwise dark, and play around with it. Just do so when you normally watch most of your content, or be ready to change it if you watch something at a different time of day.

Generally speaking, increasing your brightness control will have a somewhat similar effect to changing the gamma (raising the brightness of shadows), but usually this just bumps up the bottom of the curve. So the curve stays the same, but the darkest it can get goes up.
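Here's a toy sketch of that difference, using made-up numbers rather than any real TV's behavior: the brightness control lifts the floor of the curve, while the gamma control changes its shape.

```python
# Toy comparison: changing gamma reshapes the curve, while a brightness boost
# (as described above) mostly raises the darkest level the curve can reach.
# The 5% "floor" below is an arbitrary example value.

def tv_output(signal, gamma=2.2, floor=0.0):
    return floor + (1.0 - floor) * signal ** gamma

print(tv_output(0.0))                # 0.0  -- normal black
print(tv_output(0.0, floor=0.05))    # 0.05 -- black is now raised
print(tv_output(1.0, floor=0.05))    # 1.0  -- full white is unchanged
print(tv_output(0.2, gamma=2.0))     # brighter shadows via a lower gamma instead
```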

How all of these settings ultimately work depends on the TV. The best-looking images will be ones with a gamma curve you find pleasing, and the brightness control set to the lowest it can be without making the details in the shadows disappear.

More detail about gamma isn't strictly required, though you're certainly welcome to dig deeper. This is because the TV world is moving rapidly to get rid of gamma. Sort of.

A whole new world (of HDR)

OK, forget about gamma for a moment. With the advent of HDR, this fundamental part of how TVs work has radically changed. Instead of gamma, HDR uses an EOTF, or Electro-Optical Transfer Function. Technically "gamma" is also an "EOTF," but I'm going to refer to them as separate terms to keep things simple.

Wait, don't leave! This sounds confusing, but it's actually way more logical than "gamma." HDR's EOTF essentially dictates a specific real-world brightness level. Gamma, and all previous content, gave a TV the instruction, "produce 20 percent of your maximum brightness." But the EOTF says, "produce 200 nits."

So that's what that complicated title means: "Electro" refers to the electronic info in the HDR content. "Optical" refers to the light you get out of your TV. And "transfer function" is a fancy way of describing how to get from one to the other.

This is more or less what gamma correction was doing before. It's just a bit more direct now. "Gamma" is more relative, and "EOTF" is more specific. 
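A quick sketch of that relative-versus-absolute distinction, using the example numbers from above (the real HDR10 EOTF is the SMPTE ST 2084 "PQ" curve; these toy functions just illustrate the idea):

```python
# Toy illustration: gamma-era video is relative to whatever the TV can do,
# while an HDR EOTF targets a physical brightness in nits. Numbers mirror
# the example in the text; this is not a real EOTF.

def gamma_era_output(fraction_of_max, tv_peak_nits):
    # "Produce 20 percent of your maximum brightness" -- the answer depends
    # entirely on how bright the TV happens to be.
    return fraction_of_max * tv_peak_nits

def hdr_eotf_output(target_nits, tv_peak_nits):
    # "Produce 200 nits" -- the same answer on any TV bright enough to comply.
    return min(target_nits, tv_peak_nits)

print(gamma_era_output(0.20, tv_peak_nits=100))    # 20 nits on a dim TV
print(gamma_era_output(0.20, tv_peak_nits=1000))   # 200 nits on a bright one
print(hdr_eotf_output(200, tv_peak_nits=1000))     # 200 nits, as mastered
```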

With gamma, there was no way of knowing how bright a TV was when it got into your home, and no way of knowing how bright its maximum was. Maybe "20 percent of maximum" was 200 nits, but it could have just as easily been 2 or 20. That's a massive difference, and it was hard for content creators (directors, producers, and so on) to ensure that what you saw at home was what they intended for you to see.

With HDR's EOTF, it's… easier. Not exactly easy, but better than it was. At the mastering stage, the content creators can say "OK, I want the brightest part of my show to be 1,000 nits." That refers to the brightest visual moment, such as a glint off a window or a flashlight in the darkness. The mastering team then builds the rest of the brightness levels around this: This shadow is 50 nits, that cloudy sky is 600 nits, and so on. 

When you play back this content at home, your TV will produce the exact physical brightness the content creators saw when they made the show or movie. The result is a more accurate representation of their vision.

That's the idea anyway. It's not perfectly simple, largely because not all TVs can actually produce the 1,000 nits -- and in some cases up to 4,000 -- required by the content. If a TV can't produce the required amount of light, it will either re-map it (compressing down, essentially), or clip it off altogether. To put it another way, the TV would act as a low doorway, and any tall people going through would either get smushed down so they fit, or they'd get their heads chopped off. Neither change is great and both are permanent.
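Here's a toy sketch of those two approaches, assuming a hypothetical TV that tops out at 600 nits showing content mastered to 1,000 nits (real tone mapping uses smoother roll-off curves than this):

```python
# Two crude ways a TV might handle highlights it can't physically produce:
# chop them off at its peak, or scale everything down so the brightest
# highlight still fits. The nit values are hypothetical.

TV_PEAK = 600.0        # what this hypothetical TV can produce
CONTENT_PEAK = 1000.0  # what the content was mastered to

def clip(nits):
    # "Heads chopped off": anything above the TV's peak is lost.
    return min(nits, TV_PEAK)

def compress(nits):
    # "Smushed down": detail is kept, but everything gets dimmer.
    return nits * (TV_PEAK / CONTENT_PEAK)

for target in (50, 600, 900, 1000):
    print(target, clip(target), compress(target))

# A 900-nit and a 1,000-nit highlight both clip to 600 (the detail between
# them disappears), while compression keeps them distinct at 540 and 600.
```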


Two projectors, side by side. 

Geoffrey Morrison/CNET

Notice how there are three individual lights in the left image in the photo above, but a single blob of light on the right. This is an example of different approaches to HDR processing. Neither projector can create the required nits, but the one on the left is showing you the full highlight detail while sacrificing brightness. The other is clipping the detail, effectively lopping it off, but creating a brighter highlight.

While not perfect, HDR's EOTF is a better way to produce content for TVs. How well lower-cost TVs handle HDR content they can't physically show will be a big factor in their overall performance. The image above, with the two projectors side-by-side, is a good example. Neither can fully produce the brightness required of the HDR content, but one is doing a much better job fooling you into thinking it can.

The once and future gamma

Though a relic of the past, gamma on TVs isn't going anywhere for a long time. We've got over 70 years of non-HDR content that's still going to get watched, and it will be a while before HDR TVs are in the majority. Largely speaking, all of this is done behind the scenes, embedded in the content itself, and handled by your TV automatically. Not always, though.

If you've got a projector or a higher-end TV, it's worth digging through the settings to see if you can adjust the gamma. It's possible you'll like a different setting than the stock one. So hopefully this will give you a better idea of what the adjustment does, beyond making the shadows look brighter.

If not, next time you see EOTF, you'll at least be able to say "I know all about the electromaniacal total feature. It has to do with the Hulk."


Got a question for Geoff? First, check out all the other articles he's written on topics like why all HDMI cables are the same, TV resolutions explained, LED LCD vs. OLED and more. Still have a question? Tweet at him @TechWriterGeoff then check out his travel photography on Instagram. He also thinks you should check out his best-selling sci-fi novel and its sequel.