Most content isn't shot digitally; it's pulled from film, so it's entirely possible for noise (film grain) to be in the picture from the start.
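A minimal sketch of that idea (hypothetical numpy code, not from this thread): noise that's already in the source frame rides through a digital pipeline untouched, because digital delivery is bit-exact.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate one 8-bit luma frame scanned from film: a flat gray field
# plus film-grain-like noise, quantized to digital code values.
# (The frame size and grain strength here are made-up illustration values.)
clean = np.full((1080, 1920), 128.0)            # ideal flat gray
grain = rng.normal(0.0, 6.0, clean.shape)       # grain baked into the scan
frame = np.clip(clean + grain, 0, 255).astype(np.uint8)

# "Broadcast" the frame: serialize to raw bits and back. With no
# transmission errors, the received frame is identical, grain and all.
bits = np.unpackbits(frame.ravel())
received = np.packbits(bits).reshape(frame.shape)

assert np.array_equal(frame, received)          # bit-perfect delivery
print("noise std in received frame:", received.std())  # grain survives
```

In other words, the 0s and 1s arrive perfectly; they just describe a picture that was noisy to begin with.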
I am just curious...can anyone out there explain to me how an HD picture can contain video noise? I thought HD was digital, which means it's essentially 0s and 1s. By that logic, the HD broadcast should be perfect every time, unless there is an actual signal interruption.
Yet on my XBR4 LCD TV there is quite often noise in HD broadcasts, and when I enable noise reduction it usually cleans up the image really nicely. But I don't understand how and why there is noise in a digital signal in the first place.
Also, on my 1080p upconverting DVD player, when I play DVDs there is also some noise in the picture that my TV's noise reduction cleans up. It's connected through HDMI, so once again...how does a digital signal contain noise?
My only guess is that the noise is present in the material itself, meaning it's part of the original picture on HD broadcasts, although I'm still not sure about DVDs.
If someone is technical enough to explain it to me, I'd appreciate it. I wasn't able to find anything useful when Googling the subject.