Contrast ratio is the most important aspect of a TV's performance. More than any other single metric, a set's contrast ratio will be the most noticeable difference between two TVs.
That is, if you could juxtapose them. Which you can't. Or if you could compare their claimed specs. Which you can't.
Understanding what contrast ratio is and how to judge it will help you determine the best TV for your dollar. But it's a lot harder than it sounds.
In its simplest form, contrast ratio is the difference between the brightest image a TV can create and the darkest. Put another way: white ÷ black = contrast ratio. If a TV can output 45 foot-lamberts with a white screen and 0.010 ft-L with a black screen, it's said to have a contrast ratio of 4,500:1.
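For the numerically inclined, the math is trivial enough to sketch in a few lines of Python, using the example figures above:

```python
# Contrast ratio is simply peak white luminance divided by black luminance.
# The figures below are the example numbers from the text (in foot-lamberts).
def contrast_ratio(white_ftl, black_ftl):
    """Return the contrast ratio as a single number (the X in X:1)."""
    return white_ftl / black_ftl

ratio = contrast_ratio(45.0, 0.010)
print(f"{ratio:,.0f}:1")  # prints 4,500:1
```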
Unfortunately, it gets more complex from there.
There is no standard for how to measure contrast ratio. In other words, a TV manufacturer could measure the maximum light output of one pixel driven at some normally unobtainable maximum, then measure that same pixel with no signal going to it at all. This hardly represents what you'd see at home, but without a standard, such trivialities don't matter to TV manufacturers.
Worse, contrast ratio numbers have gotten so extreme, there is literally no way to measure some of them. What happens more often than not is the marketing department will come up with the number it needs to sell the product. The engineers will shuffle their feet, and stare at the wall, and magically the TV has that contrast ratio.
The only way to get realistic contrast ratio numbers is from reviews, but even this isn't always accurate, as we'll see.
Contrast ratio: Good and bad
Because you're reading this article on a device that has its own contrast ratio, I can't give you real examples of what good and bad contrast ratios look like, so I'll have to fake it. If you can, make sure your computer monitor is set decently. Below is an example of an image with good contrast (left), and one with bad (right).
Pretty easy to see that you'd want the one on the left, correct? The image on the right has a higher black level, and if I were demoing two different TVs with this image in front of you, you'd notice that the lights aren't as punchy on the right TV either.
Native vs. dynamic
There are two types of contrast ratio. Most often these are referred to as "native" and "dynamic." Native contrast ratio is what the display technology itself can do. With an LCD, this is what the liquid crystal panel itself is capable of. With DLP, it's what the DMD chip (or chips) can do.
Imagine putting the image above on your TV's screen. Native contrast ratio is how dark the darkest parts of the image are, compared with the brightest parts of the same image. I like to call this "intra-scene contrast ratio" though I'm certainly open to something better if anyone has an idea.
The reason for the distinction is that most TVs now have a dynamic contrast ratio. This is a broad term for technologies that augment the native contrast ratio of the TV. They work by having the TV sense what content it's showing, then adjust the overall light output accordingly, in real time, depending on the video.
When an adjustable backlight, or a projector's iris, is used in conjunction with circuitry that monitors the video signal, the TV can adjust its overall light output in real time depending on what's onscreen. This dynamic contrast ratio looks like this:
A bright image is bright, a dark image is dark. Done well, this does increase the apparent contrast ratio of a display, but not nearly as much as the numbers would suggest. A TV with a 5,000,000:1 contrast ratio would be unbelievable to look at. Too bad one doesn't exist. A TV with a high dynamic contrast ratio may look better than a TV that has no such circuitry, but it won't look as good as a display with a high native contrast ratio.
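To make the idea concrete, here's a toy Python sketch of what dynamic-contrast circuitry does. The function and the linear scaling are my own illustration, not any manufacturer's actual algorithm: the TV looks at how bright the current frame is overall, then scales the backlight down for dark scenes and up for bright ones.

```python
# A toy sketch (assumed logic, not real TV firmware) of dynamic contrast:
# compute the average brightness of the frame, then scale the backlight
# accordingly. Dark scenes get a dim backlight; the backlight never goes
# fully off while any video is onscreen.
def backlight_level(frame, min_level=0.05, max_level=1.0):
    """frame: pixel brightness values in 0.0-1.0. Returns backlight drive."""
    avg = sum(frame) / len(frame)  # average picture level of the frame
    return min_level + (max_level - min_level) * avg

dark_scene = [0.02] * 100
bright_scene = [0.9] * 100
print(backlight_level(dark_scene))    # close to min_level
print(backlight_level(bright_scene))  # close to max_level
```

This is also why the end-credits example below falls apart on dynamic-only displays: the frame is mostly black, so the whole backlight dims, text and all.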
Yes, the LEDs of an LED LCD can turn off, creating a true black, but this will never happen when there is any amount of video on the screen. Picture the end credits of a movie. A display with a high native contrast ratio will show this as a dark black background with punchy white text. A display with a high dynamic contrast ratio may have a similarly dark background, but the text won't be as bright.
Again, I can't make the image on your screen darker or lighter, but here's kinda what it would look like:
As you can see, a display with a high native contrast is the way to go, if that's what you're going for. The night sky is black, but the streetlights pop out. The day sky is bright, but the dark jacket is dark. This is more like CRT, more like film, more like life.
The technology with the highest native contrast ratio is... LCOS. At the moment, JVC front projectors using their version of the technology (D-ILA) have the highest native contrast ratios I've measured. Sony's version (SXRD) comes in a rather distant second. Third is plasma, though some DLP projectors are close.
LCD has come a long way in the past decade, but it still lags behind the other technologies. Thankfully, the better LCD manufacturers know this and have come up with a few ways to mimic the high native contrast ratios of the other technologies.
The best way to get a high intra-scene contrast ratio with LCDs is local dimming. This is when the backlight of the LCD is an array of LEDs, all of which can dim depending on what's onscreen. It's not done on a per-pixel level, but the LED zones are generally small enough that the overall effect is quite good, and far better than what the LCD panel can do on its own. The downside is an artifact known as "halos," where the LEDs light up behind small bright objects and the glow is visible because the surrounding areas of the screen are dark. This is very noticeable on specific types of content (like movie credits or star fields), but generally local dimming works really well. I was going to Photoshop some halos onto a screenshot of the one movie where I actually had a screen credit, but it came across as more douchey than helpful.
Unfortunately, most manufacturers have moved away from full-array LED backlights, which are the only kind that can do local dimming well, because of the cost.
Most LED LCDs these days are "edge-lit," as in their LEDs are along the sides (or the top and bottom, or both). Several companies have developed methods to dim areas of the screen with LED edge lighting, though the effect isn't as good as with full-array LEDs. Again, every bit helps, though, and many edge-lit LED LCDs look amazing.
Measuring, a whole other problem
You may be asking yourself: how can you, as a consumer, find out which display has the best contrast ratio? Good question. You can't tell in a store, as the store lighting will throw off any comparison (biasing it toward LCDs, or TVs with antireflective and/or antiglare screens that have better ambient-light rejection). As mentioned, all manufacturers manufacture their numbers with little basis in reality, so spec sheets are out.
So that leaves reviews. Sadly, few review sites measure contrast ratio, and those that do aren't consistent with one another. There is no set standard for reviewers on how to measure contrast ratio either, so the numbers can vary wildly. I may measure 20,000:1, while Joe Numbnutz over at TVAwesomeReviews.com measures 1,000:1 with his Datacolor Spyder (a decent product, but not a valid measurement tool for contrast ratio).
And then what do you measure? I would say a black field (0 IRE) from a DVD/Blu-ray or signal generator, and a white field (100 IRE) from the same source, in the same picture mode. This gives a decent view of the overall contrast ratio, but it isn't terribly relevant to actual video (which is never totally dark or totally white). Also, what about a TV whose dynamic circuitry can't be disabled? That's not a valid measurement when compared with TVs where it can. Or how about displays that actively limit the total current draw (as all plasmas do)? With these, a full white field will be significantly darker than what's possible on smaller areas of the screen.
ANSI contrast ratio is a good addition. This is where eight white and eight black boxes in a checkerboard pattern are measured and averaged. It gives a good idea of what a display is doing, and is far more relevant to actual video. Even this, though, is problematic, as the brightness of the white boxes can affect the measurement of the black boxes. Done right, it is also exceedingly time-consuming. When I started measuring ANSI contrast ratio at Home Theater, it nearly doubled the total amount of time spent measuring a television. Spending that much time on one measurement that most people will overlook is not an effective use of time.
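The arithmetic behind an ANSI measurement is simple; the sixteen readings are the hard part. Here's a small Python sketch, with hypothetical meter readings I made up for illustration:

```python
# ANSI contrast: measure the eight white and eight black boxes of a 4x4
# checkerboard, then divide the average white luminance by the average black.
def ansi_contrast(white_readings, black_readings):
    """Each argument: eight luminance measurements (e.g. in ft-L or nits)."""
    avg_white = sum(white_readings) / len(white_readings)
    avg_black = sum(black_readings) / len(black_readings)
    return avg_white / avg_black

# Hypothetical readings, not from any real TV. Note the black boxes read
# higher than a full black field would, because light from the adjacent
# white boxes spills into them.
whites = [44.0, 45.2, 44.8, 45.0, 44.5, 45.1, 44.9, 44.7]
blacks = [0.060, 0.058, 0.062, 0.059, 0.061, 0.060, 0.057, 0.063]
print(f"{ansi_contrast(whites, blacks):,.0f}:1")
```

That spill from white boxes into black ones is exactly the contamination problem mentioned above, and part of why ANSI numbers come out so much lower than full on/off numbers.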
I hate to say it, but there is no good answer. Yep, 1,500 words to get to that conclusion. Sorry. The best we can hope for is reasonably accurate measurements from sites like CNET to give a general idea of what's going on, and the knowledge from the rest of this article and others like it to extrapolate what the performance will be in your home.
Like nearly all TV buying guides say: It's all in what you want to do with the TV. If you're a movie buff and you watch TV in a dark room or at night, the added contrast of plasma will be very cinematic.
If you watch a lot of TV during the day, the brightness of an LED LCD can't be beat.
Somewhere in between is an LED LCD with some kind of local or zone dimming, offering better intra-scene contrast ratio than a "normal" LCD, but still offering that technology's extreme light output.
Personally, I skip the TV altogether and just go with projection. LCOS is great.
No matter what, when you get your TV home, it's vital you set it up correctly.
Got a question for Geoff? First, check out all the other articles he's written. Still have a question? Send him an e-mail! He won't tell you which TV to buy, but he might use your letter in a future article. You can also send him a message on Twitter: @TechWriterGeoff.