CNET reader "Nindevo" asks:
In regards to your articles about the HDMI cables, I was just wondering why TVs have "noise reduction" settings. I thought digital signals (HDMI) couldn't have noise.
This is actually a pretty common misconception. There are three things going on: the cable, the signal, and the TV.
1) The cable
The cable, in this example HDMI, does nothing more than transport the signal. Think of it like an aqueduct or sewer. It just carries the information, and doesn't care how much crap is in it. The HDMI cable itself has no effect on the picture quality and is not a cause of any video noise.
Analog cables, like component, can have a subtle effect on the signal (attenuating it, possibly), but only over long runs or in cases of extreme interference.
2) The signal
The signal itself, however, can have lots of noise. Older VHS tapes or some cable/satellite channels, for example, can be noisy. To some extent, a TV's noise reduction circuitry can minimize this.
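To make the idea concrete, here's a toy sketch (not how any actual TV's circuitry works, which is far more sophisticated and usually operates across frames as well as within them) of one classic noise-reduction approach, a median filter, applied to a single scanline of brightness values:

```python
# Toy illustration of one common noise-reduction technique: a median filter.
# Real noise-reduction circuitry is much fancier, but the core idea is
# similar: replace each pixel with a value derived from its neighbors, so
# isolated "speckle" noise gets smoothed away.

def median_filter(scanline):
    """Apply a 3-tap median filter to a list of pixel brightness values."""
    out = list(scanline)
    for i in range(1, len(scanline) - 1):
        out[i] = sorted(scanline[i - 1:i + 2])[1]  # median of 3 samples
    return out

# A mostly flat scanline with one noisy spike:
noisy = [50, 50, 50, 200, 50, 50, 50]
print(median_filter(noisy))  # the spike is smoothed away: all 50s
```

The catch, as noted below, is that a filter like this can't tell noise from legitimate fine detail, so aggressive noise reduction smooths away both.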
Most Blu-rays are authored in such a way as to minimize noise. This is because early BDs accurately transferred film grain, and many people mistook that grain for noise. So these days, even if the film has grain, the transfer is often scrubbed clean and pristine.
Generally speaking, though, noise in the signal is often deliberate on the part of the filmmakers to give their movie/TV show a visual "style."
3) The TV
The most common cause of video noise is the TV itself. If you're watching a standard-definition (480i) signal, such as DVD or SD cable/satellite channels, the TV has to upconvert (scale) it to fit its higher-resolution screen. This process has to create millions of pixels every second out of thin air, so to speak. Most modern TVs are pretty good at it, but there are unwanted side effects. In an effort to make the image as detailed as possible, the TV might end up accentuating the noise more than the actual detail in the image. This is a delicate trade-off on the part of the TV manufacturers: lots of detail but some noise, or less detail but no noise. The best scalers manage lots of detail with little noise; the worst, the opposite.
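Here's a toy sketch of what "creating pixels out of thin air" means. This is simple linear interpolation on a single scanline; real scalers use far fancier math, but the trade-off above comes from the same place: sharper interpolation also sharpens whatever noise is in the source.

```python
# Toy sketch of upscaling: inventing new pixels between the ones the TV
# actually received, via linear interpolation along one scanline.

def upscale_line(pixels, factor):
    """Linearly interpolate a scanline to `factor` times as many samples."""
    out = []
    n = len(pixels)
    for i in range(n * factor):
        pos = i / factor                 # position in the original scanline
        left = int(pos)
        right = min(left + 1, n - 1)
        frac = pos - left
        out.append(pixels[left] * (1 - frac) + pixels[right] * frac)
    return out

sd_line = [0, 100]                # two original pixels
print(upscale_line(sd_line, 2))   # four pixels: [0.0, 50.0, 100.0, 100.0]
```

Scale 480 lines up to 1080 at 60 frames per second and you can see why the quality of this guesswork varies so much between TVs.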
Adding noise reduction in these cases may create a "smoother" image, but it's also likely robbing you of some fine detail.
Another culprit in the noise department is the Sharpness control. This is almost always an "edge enhancement" control. It adds an artificial edge around sharp lines to give the appearance of detail, when in reality it's doing the opposite (masking fine detail). It can also accentuate small amounts of noise in the image. If your TV shows lots of video noise, try turning this control way down. For that matter, this control should be close to 0 on just about every TV. Try it. At first the image will appear softer, but as you get used to it, you'll start noticing a lot more fine detail: hair, wrinkles, textures in clothes, and so on. In some cases, though, turning the Sharpness control all the way down does actually soften the image (for reasons no sane person can figure out).
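A toy sketch of what an edge-enhancement circuit is doing (a greatly simplified "unsharp mask"; actual TVs vary): it boosts the difference between each pixel and the average of its neighbors, which exaggerates edges but also exaggerates any noise sitting next to them.

```python
# Toy sketch of a Sharpness ("edge enhancement") control: boost each pixel's
# difference from the average of its neighbors. Note the overshoot (halo)
# it creates on either side of a clean edge.

def sharpen(scanline, amount):
    """Exaggerate edges in a scanline by `amount` (0 = no change)."""
    out = list(scanline)
    for i in range(1, len(scanline) - 1):
        neighbors_avg = (scanline[i - 1] + scanline[i + 1]) / 2
        out[i] = scanline[i] + amount * (scanline[i] - neighbors_avg)
    return out

edge = [50, 50, 50, 200, 200, 200]     # a clean dark-to-bright edge
print(sharpen(edge, 1.0))
# The pixels flanking the edge overshoot, creating the artificial halo:
# [50, 50.0, -25.0, 275.0, 200.0, 200]
```

That overshoot is the ringing you see as a bright or dark outline around objects when Sharpness is cranked up.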
Lastly, different TV technologies can create their own video noise. The way plasma TVs create an image requires dithering to create a smooth picture. With modern plasmas, this is only noticeable if you put your face right up to the screen. If you're seeing video noise with a plasma from your couch, it's probably not the dithering; it's more likely one of the other causes mentioned above.
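For the curious, here's a toy sketch of what dithering does (simple one-dimensional error diffusion; plasma panels do something analogous in time as well as space): a display with only a few brightness levels fakes the in-between shades by mixing the levels it has, so the pattern averages out to the right shade from a distance.

```python
# Toy sketch of dithering via 1-D error diffusion: quantize each pixel to
# the nearest displayable level, then pass the rounding error to the next
# pixel so the line still averages out to the intended brightness.

def dither_line(pixels, levels):
    """Quantize a 0-255 scanline to `levels` values, diffusing the error."""
    step = 255 / (levels - 1)
    out = []
    error = 0.0
    for p in pixels:
        target = p + error
        q = round(target / step) * step    # nearest displayable level
        q = min(max(q, 0), 255)
        error = target - q                 # carry the rounding error along
        out.append(q)
    return out

# A mid-gray (128) line on a 2-level (black/white) display becomes an
# alternating black/white pattern that reads as gray from a distance:
print(dither_line([128] * 6, 2))
```

Up close you see the checkerboard-like pattern; from the couch, your eye blends it into a smooth shade, which is exactly the trick.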
Got a question for Geoff? Click "Geoffrey Morrison" below then click the "E-mail" link in the upper right to e-mail, wait for it...Geoffrey Morrison! Put "Morrison's Mailbag" somewhere in there. If it's witty, amusing, and/or a good question, you may just see it in a post just like this one. No, I won't tell you what TV to buy. Yes, I'll probably truncate and/or clean up your e-mail. You can also send me a message on Twitter @TechWriterGeoff.