Blu-Ray (if set to output 1080p) will not downconvert the signal. What happens is this:
- The Blu-Ray player sends a 1080p signal to the TV via HDMI.
- The TV (since you mention it doesn't accept a true 1080p signal) downconverts the signal to 1080i, and...
- Since the TV's native resolution is actually 1080p, it then upconverts the signal it just downconverted so the image matches the panel's resolution.
It sounds confusing, but it's a three-way operation. Of course this only takes fractions of a second to happen, and in the end you shouldn't see any compromise in picture quality, because the signal is brought back to its original resolution.
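A toy sketch (hypothetical, not a real video pipeline) of why that round trip can be lossless: for a static scene, splitting a progressive frame into two interlaced fields and then weave-deinterlacing them recovers the original frame exactly.

```python
def interlace(frame):
    """Split a progressive frame (a list of rows) into two fields."""
    top_field = frame[0::2]      # even-numbered rows (0, 2, 4, ...)
    bottom_field = frame[1::2]   # odd-numbered rows (1, 3, 5, ...)
    return top_field, bottom_field

def weave_deinterlace(top_field, bottom_field):
    """Recombine the two fields back into one progressive frame."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)
        frame.append(bottom_row)
    return frame

# A tiny stand-in "frame": 8 rows of 4 pixel values each.
frame = [[r * 10 + c for c in range(4)] for r in range(8)]

top, bottom = interlace(frame)
rebuilt = weave_deinterlace(top, bottom)
print(rebuilt == frame)  # True: nothing is lost for a static frame
```

The catch is motion: in real 1080i the two fields are captured (or transmitted) at different moments, so a simple weave shows combing artifacts on moving objects, and the TV's deinterlacer has to interpolate instead. That interpolation is where any picture-quality compromise would come from.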
If anyone else would like to add anything, please do.
Other than the HP MD80N line of TVs, nothing right now accepts a true 1080p signal over HDMI. So my question is: what happens in the following situation?
- A Blu-Ray player hooked up to a 1920x1080 native-resolution (1080p) TV that can't accept true 1080p over HDMI. So we set the Blu-Ray player to 1080i output through HDMI, and the TV de-interlaces the image to make it progressive... and my questions:
1) Is what I typed correct?
2) If so, how much compromise in picture quality are we looking at when a 1080p-native TV displays a true 1080p signal, as opposed to the same TV displaying a 1080i signal?
