
TVs are only getting brighter, but how much light is enough?

Recent prototype TVs with high dynamic range can get 15 times brighter than yours. Will you need sunglasses to watch them?

Geoffrey Morrison Contributor

At CES 2018, Sony demonstrated a prototype TV putting out a claimed 10,000 nits. That's 10 to 15 times brighter than your current TV.

Today's TVs are already significantly brighter than their predecessors, which can really help the image pop, especially in bright rooms. And while high dynamic range (HDR) delivers the best home video picture quality available today, to get the most out of HDR TV shows, movies and games, your TV needs to be really bright. 

With technologies like full-array local dimming, quantum dots and Micro LED, televisions are only going to get brighter. Which raises the question: How bright is too bright, anyway? 

Like most stuff that has to do with new TV technology, the answer can be complicated. Allow me to try to simplify it.


Nit is another name for "candela per square meter" (cd/m²), the standard unit of luminance. Or, put simply and colloquially, "brightness." 

In the US we've historically used the foot-lambert, though thankfully that term seems to have disappeared from pretty much everywhere. Not that "nit" sounds much better than "lambert" but at least we're all speaking metric.
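If you ever need to translate old foot-lambert figures into nits, the conversion is fixed: one foot-lambert is about 3.43 candela per square meter. A quick sketch in Python (16 fL is the commonly cited SMPTE target for an open-gate cinema screen, which lines up with the roughly 50-nit theater figure below):

```python
# Convert between foot-lamberts and nits (cd/m^2).
# 1 fL = (1/pi) candela per square foot, which works out to ~3.4263 cd/m^2.
FL_TO_NITS = 3.4262591

def fl_to_nits(fl: float) -> float:
    """Foot-lamberts to nits."""
    return fl * FL_TO_NITS

def nits_to_fl(nits: float) -> float:
    """Nits to foot-lamberts."""
    return nits / FL_TO_NITS

# The old 16 fL cinema target is about 55 nits.
print(round(fl_to_nits(16)))  # 55
```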

Nature has a lot of nits. The noonday sun measures around 1,600,000,000 nits and the night sky around 0.001 nits. Or, more relevant to us here, the average modern TV maxes out around 500-1,000 nits, and a movie theater screen maybe 50. We're starting to see some TVs pushing 1,500 nits, but that's still pretty rare and, in the case of the Samsung Q7, it gets dimmer after a few seconds.

Your TV at home, depending on its technology and age, maxes out at maybe 250-500 nits. I doubt you'd call it dim though, right?


Sony's 10,000-nit prototype. "Full-Spec" is a reference to the maximum light output in the current version of the HDR specification (specifically, Dolby Vision). Dolby has published a white paper on the subject. By the time a 10,000-nit TV reaches the market, though, who knows what will be possible with HDR content. 
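That 10,000-nit ceiling isn't arbitrary: it's the maximum of the SMPTE ST 2084 "PQ" transfer function that HDR10 and Dolby Vision are built on. As an illustrative sketch (the constants come straight from the public spec, though no TV's actual processing is this simple), here's the PQ curve turning a normalized signal value into nits:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized signal value in [0, 1]
# to absolute luminance in nits. The curve tops out at exactly 10,000 nits.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(signal: float) -> float:
    """PQ-encoded signal (0.0 to 1.0) -> luminance in nits."""
    e = signal ** (1 / M2)
    return 10_000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

print(round(pq_to_nits(1.0)))   # 10000: the "full spec" ceiling
print(round(pq_to_nits(0.58)))  # ~200 nits, near SDR reference white
```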


The best-looking demo at CES was the brightest

Sony's 10,000-nit prototype, which just happened to also be 8K, was stupendously bright. One wonders at the power required to drive such a beast. CES 2018 suffered a blackout -- the city of Las Vegas and the Consumer Technology Association blamed it on the rain, but maybe Sony's crazy demo TV was the real culprit. 

Seriously though, I've been reviewing and writing about TVs for 17 years and I've never resorted to this specific hyperbole: I think this prototype might be the most realistic image I've ever seen. Even in a darkened room off to the side of Sony's booth, the image was incredible. Highlights like reflections, headlights -- really anything that required a spike of brightness -- were so realistically bright they added a whole new layer of realism.

And you know what, had you told me about how great it looked, I wouldn't have believed you either. I wear sunblock to change a nightlight, so there's no way I would have considered a 10,000-nit TV a good idea. And yet… now I want one. It really looked that good.

When we first started talking about HDR, Dolby had a reference monitor prototype that was "only" 4,000 nits, and it was impressive. And small. And insanely expensive. I remember discussions that TVs would be that bright "in a few years" and found it hard to believe. Now there's 2.5 times more light on the horizon.


Some approximate brightness ranges for a simple image of a flower. According to Dolby's research, "a system that could reproduce a range of 0 to 10,000 nits satisfied 90 percent of viewers."


Not everything on screen will be 10,000 nits

Of course there are potential issues. The first, as you can probably guess, is that no one wants to have to wear Ray-Bans to watch TV. 10,000 nits is in the range of staring at the sky at midday, staring at a fluorescent office bulb, and other squint-worthy events not usually considered overly pleasant for long-term viewing.

The key? The TV isn't always 10,000 nits. Or at least, it shouldn't be. Ideally just a tiny part of the image, the brightest highlights, would be that bright. 

Imagine a character walking through the woods at night. Creepy, moody, dark. They take out a flashlight to see their way. The whole scene is still dark, just a few nits, but just the bulb on the flashlight could be 10,000 nits. As bright, perhaps, as if you were standing in those woods yourself. It would be more realistic, since the thing on screen that's supposed to be bright is actually, physically bright. That's the beauty of HDR, potentially.

It works in daytime scenes too, with bright objects looking just like they do in real life. Imagine a glint off a piece of chrome, sunlight filtering through a tree or headlights flashing rapidly to get you out of the middle of the street (because you were staring at chrome and sunbeams). Your TV could look that realistic too.

Done right, most of the TV screen won't be much brighter than what you already have, most of the time. The benefit is headroom: when the content requires something to be super bright, it can be, just like real life.

Another good analogy is car horsepower. The vast majority of day-to-day driving in any vehicle only requires 50 or so horsepower. But if you're going up a hill, passing another car or getting up to speed on the highway, most cars need a lot more. 

This Sony prototype is like a 911 Turbo S. Puttering along at 65 mph it's barely awake, using very little power and getting 24 mpg. But if you need to do some, shall we say, spirited maneuvering, it has the ability to flatten your eyeballs and warp time and space.

Also like the Porsche, just because that performance is there doesn't mean you'll need to use it. If you get a shiny 10,000-nit TV, you'll be able to turn down that peak brightness if you find it fatiguing. Almost certainly, there will be some sort of night mode that will limit the overall light output. There will also likely be limiters that prevent the whole screen from getting too bright, just in case some ill-mastered content tries to exfoliate your retinas with a full-screen white image at full brightness.
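Full-screen limiting like this isn't hypothetical; today's OLED TVs already do automatic brightness limiting (ABL). Here's a toy Python sketch of the idea -- purely illustrative, not any manufacturer's actual algorithm, and the 600-nit full-screen budget is an assumption -- where small highlights pass through untouched but a full-screen flash gets scaled down:

```python
def limit_frame(pixels_nits: list[float], max_avg_nits: float = 600.0) -> list[float]:
    """Toy automatic brightness limiter: if a frame's average luminance
    exceeds the panel's full-screen budget, scale the whole frame down.
    Mostly dark frames with small, bright highlights pass through as-is."""
    avg = sum(pixels_nits) / len(pixels_nits)
    if avg <= max_avg_nits:
        return pixels_nits  # plenty of headroom: leave the frame alone
    scale = max_avg_nits / avg
    return [p * scale for p in pixels_nits]

# A dark scene with one 10,000-nit highlight sails through unchanged...
dark_scene = [5.0] * 99 + [10_000.0]
print(limit_frame(dark_scene) == dark_scene)  # True

# ...but an all-white 10,000-nit frame gets pulled down to the budget.
white_flash = [10_000.0] * 100
print(max(limit_frame(white_flash)))  # 600.0
```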

But when the situation calls for it, and the right content controls it, these TVs will have massive and realistic performance waiting to be unleashed.

Long as I can see the light

It has long been an accepted truth that the brightest TV on the sales floor sells the best. In big-box stores, cranking the brightness certainly helps juice sales. Along with the advent of 4K, a comparatively dim image helped contribute to the demise of plasma TVs.

So how bright is too bright? Even though the human eye is capable of seeing a huge range of brightness, there's a comfort limit. If you've ever gotten eye fatigue from staring at a screen in a dark room, you're familiar with this limit. It varies per person, and per situation. There's no set amount. 

With light output assuredly heading for the exosphere, however, it's going to be important that TVs, and filmmakers, keep a tight leash on this potential performance. Like flamethrowers in the hands of teenagers, these are incredible tools that could be used with terrible results. The best TVs of the next decade won't necessarily be the brightest, but the ones that are best able to wrangle that brightness to create a punchy, realistic, easy-to-watch image.

Hmmm… just like the TVs of today.

Read more: The TVs of tomorrow will turn invisible

Got a question for Geoff? First, check out all the other articles he's written on topics like why all HDMI cables are the same, TV resolutions explained, LED LCD vs. OLED and more. Still have a question? Tweet at him @TechWriterGeoff then check out his travel photography on Instagram. He also thinks you should check out his best-selling sci-fi novel and its sequel.