
What are nits, and why are they important for your next TV?

As TVs get brighter and brighter, manufacturers are using their brightness, often measured in nits, as a marketing tool. Here’s what you need to know.


If you've read any TV reviews lately, or anything about modern TV technology, you've likely come across the term "nit."

The short version is that it's a colloquial term for a unit of brightness. That description might be enough for you. Lots of nits = lots of brightness, which helps the image look better in a bright room or with HDR TV shows and movies.

If you're still reading, I assume this is because you're looking for something a bit more in depth. I've got you covered.

Nits galore

A "nit" is another way to describe a brightness of 1 candela per square meter (cd/m²). An average candle produces roughly 1 candela. Now you know where the name comes from. Happy birthday.

That amount of light, spread over a square meter, is one nit. Or to put it another way, imagine a box roughly 16 inches (40.8 cm) on each side, with a candle in the middle. A box that size has about one square meter of interior surface, so the total amount of light hitting it is 1 nit.

But let's talk about the stuff that interests us here. A movie theater screen, in your average movie theater, can probably get as bright as about 50 nits. If your TV is a few years old, pre-HDR, it can probably reach between 100 and 400 nits. Plasmas (now defunct) were on the low end of that range, while high-end LCDs were on the high end.

Modern TVs can be much brighter, with the top-of-the-line HDR TVs putting out over 1,500 nits. In the next few years, we'll likely see even higher light outputs. Sony, at CES 2018, showed a prototype TV capable of 10,000 nits.


Sony's stunningly gorgeous 10,000 nit "full-spec" HDR prototype shown at CES 2018. Even in a darkened room, the image wasn't too bright to watch, it was just impressively realistic. 

Geoffrey Morrison/CNET

Here's where we remind you that a manufacturer's claim and a real-world nits number are often not the same. In CNET's TV reviews we measure the light output in nits of every TV for both HDR and standard material, and we've found that some TVs live up to the claim and some do not. And some have other limitations, for example the tendency of some Samsung TVs to vary their light output over time, dropping to half brightness or less after a certain period. Caveat emptor.

What about feet?

You may have seen mention in TV and projector reviews of "foot-lamberts." This is the Imperial version of nits, so they're directly convertible: 1 nit = 0.29 foot-lamberts. So a TV that puts out 1,000 nits is putting out 291.9 ftL. Generally, though, everyone (including CNET reviews) uses nits now.
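Since nits and foot-lamberts measure the same thing in different units, the conversion is a single multiplication. A minimal sketch (the function names are mine, and the 0.2919 factor is the standard nit-to-foot-lambert conversion rounded to four digits):

```python
FTL_PER_NIT = 0.2919  # 1 nit (cd/m²) ≈ 0.2919 foot-lamberts

def nits_to_footlamberts(nits: float) -> float:
    """Convert luminance in nits to foot-lamberts."""
    return nits * FTL_PER_NIT

def footlamberts_to_nits(ftl: float) -> float:
    """Convert luminance in foot-lamberts back to nits."""
    return ftl / FTL_PER_NIT

print(round(nits_to_footlamberts(1000), 1))  # 291.9, matching the example above
```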

Lumens?

Another measure of light is called lumens, and this… gets complicated. For what CNET deals with, lumens typically only apply to projectors. It tells you how much light energy something is throwing out, but not exactly how "bright" it will appear. That's because you're not looking directly at a projector. If a projector has "2,000 lumens" for example, it's going to appear differently bright whether you use a 50-inch screen or a 150-inch screen.

Or to flip it around, a 500 nit phone and a 500 nit TV are going to appear equally bright to your eye. But a 2,000 lumen projector on a 50-inch screen is going to look WAY brighter than a 2,000 lumen projector on a 150-inch screen.

So, does that mean two 2,000 lumen projectors will appear equally bright on the same size screen? Nope. That'd be too easy. There's no specific method to measure lumens, so manufacturers can fudge these numbers quite a bit. It's doubtful a 2,000 lumen projector will be dimmer than a 1,000 lumen projector, but take the numbers, any manufacturer-supplied specs really, with a grain of salt. An exception to that is ANSI lumens, which specifies the method of how to measure the light. Those numbers should be largely comparable across projectors.

Or to put it simply, a projector is measured in lumens, the image it projects on a screen is measured in nits, just like a TV. Since projector manufacturers don't know what size or gain screen you're going to use, it's a lot easier to say "1,000 lumens" than "300 nits (on a 100-inch, 1.3 gain screen in a dark room with the projector sitting at screen height, unzoomed, in the Bright picture mode...)."
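You can estimate the nits a projector produces if you pin down the variables the manufacturer can't know. A common approximation is nits ≈ (lumens × screen gain) / (π × screen area in m²); the sketch below uses that formula with my own function name, assumes a 16:9 screen, and ignores zoom, picture mode and ambient light:

```python
import math

def projector_nits(lumens: float, diagonal_in: float,
                   gain: float = 1.0, aspect: float = 16 / 9) -> float:
    """Rough on-screen luminance in nits for a projector.

    Uses the approximation nits = (lumens * gain) / (pi * area_m2),
    with screen area computed from the diagonal and aspect ratio.
    """
    diag_m = diagonal_in * 0.0254                 # inches to meters
    height = diag_m / math.sqrt(aspect ** 2 + 1)  # 16:9 geometry
    width = height * aspect
    return lumens * gain / (math.pi * width * height)

# Same 2,000-lumen projector, very different apparent brightness:
print(projector_nits(2000, 50))   # small screen: much brighter image
print(projector_nits(2000, 150))  # big screen: same light, 9x the area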

Neat nits

TV manufacturers have always striven to create bright televisions. The brightest TV is the one that sold, or so the old adage went. Now, in the HDR era, this brightness has another purpose: picture quality. One of the main aspects of HDR performance is creating realistic highlights. The brighter these small areas of the screen are, the better. Imagine, for example, a glint off an aircraft's metallic skin. In real life, this will be significantly brighter than the rest of the scene. On a great HDR TV, it is as well.

This isn't to say a 2,000 nit TV is going to always look better than a 1,500 nit TV, but it can be a factor. Brightness (nits) is only one half of the all-important contrast ratio equation; the other is black level. Meanwhile new technologies like quantum dots are pushing overall performance, including brightness, to levels we couldn't have imagined 10 years ago.
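Contrast ratio is just peak brightness divided by black level, which is why nits alone don't tell the whole story. A quick illustration (the black-level figures are hypothetical, chosen to be ballpark-typical for LCD and OLED):

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Native contrast ratio: peak white luminance over black level."""
    return peak_nits / black_nits

# Same 1,500-nit peak, very different contrast depending on black level:
print(contrast_ratio(1500, 0.05))    # 30,000:1  (illustrative LCD-like black)
print(contrast_ratio(1500, 0.0005))  # 3,000,000:1 (illustrative OLED-like black)
```

Two TVs with identical peak nits can look dramatically different if one has much deeper blacks, which is why reviews measure both.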

So there you go, all the nits fit to pick. 


Got a question for Geoff? First, check out all the other articles he's written on topics like why all HDMI cables are the same, TV resolutions explained, LED LCD vs. OLED and more. Still have a question? Tweet at him @TechWriterGeoff then check out his travel photography on Instagram. He also thinks you should check out his best-selling sci-fi novel and its sequel.
