Thank you for being a valued part of the CNET community. As of December 1, 2020, the forums are in read-only format. In early 2021, CNET Forums will no longer be available. We are grateful for the participation and advice you have provided to one another over the years.

Thanks,

CNET Support


Optical or coax? 720p or 1080i?

Dec 6, 2010 6:16PM PST

Can someone delineate the specification differences between using the coax connection (the orange RCA jack) and the optical fiber connection for 5.1 surround sound? Can they both do 96 kHz/24-bit? Is the bit rate higher for optical? Does it matter at all? Is the compression different, etc.? I have also seen what seems to be another audio connection (perhaps labeled SPDIF) on some TVs in lieu of the orange coax, maybe with a black-colored connector. Any ideas on that connection, if I am not confusing something there?
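For reference, both the orange coax jack and the optical (TOSLINK) jack carry the same S/PDIF signal, so the data-rate side of the question can be sketched with simple arithmetic. Below is a minimal, illustrative Python calculation (not an answer from the thread) assuming standard S/PDIF behavior: two channels of uncompressed PCM, with 5.1 sent as a compressed bitstream such as Dolby Digital or DTS:

```python
# Rough data-rate arithmetic for an S/PDIF link (illustrative only).
# S/PDIF carries two channels of uncompressed PCM; 5.1 audio travels
# over it as a compressed bitstream (e.g. Dolby Digital or DTS),
# regardless of whether the physical link is coax or optical.

def pcm_rate_bps(channels, sample_rate_hz, bits_per_sample):
    """Raw PCM payload rate in bits per second."""
    return channels * sample_rate_hz * bits_per_sample

# Stereo PCM at 96 kHz / 24-bit -- the kind of signal both the coax
# and the optical connector carry identically.
stereo_96_24 = pcm_rate_bps(2, 96_000, 24)
print(f"Stereo 96/24 PCM: {stereo_96_24 / 1e6:.3f} Mbit/s")  # 4.608 Mbit/s

# Six discrete channels of 96/24 PCM would need far more payload,
# which is why 5.1 goes over S/PDIF in compressed form.
six_ch_96_24 = pcm_rate_bps(6, 96_000, 24)
print(f"5.1 at 96/24 PCM would need: {six_ch_96_24 / 1e6:.3f} Mbit/s")  # 13.824 Mbit/s
```

The point of the sketch: the bit rate is set by the S/PDIF format, not by the cable type, so coax versus optical does not change what either jack can carry.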

Interlaced and progressive scan
I have successfully confused myself on resolution variability and settings. My understanding in the past was that a TV was either progressive or interlaced in how it scanned the displayed picture. Now HD sets seem to do both, and you throw upscaling in between. I think I understand P versus I conceptually, but I'm not exactly sure how the resolution metamorphosis pans out. For example: if I have my TV set to 720p on the component input, and a 1080i image comes in from the source, what does the TV do with it? Also, which is better, and when: 720p or 1080i? If someone can explain, or point me to a good source, thanks a lot! I think my confusion stems from the way resolution is marketed to the consumer in relation to devices and displays.
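One way to frame the 720p-versus-1080i comparison is raw pixel throughput. The back-of-envelope Python below is an illustrative sketch (not from the thread), assuming a 60 Hz refresh: 720p draws 60 full frames per second, while 1080i draws 60 fields per second, each field being half a frame:

```python
# Back-of-envelope pixel throughput for 720p60 vs 1080i60 (illustrative).
# 720p60:  60 full 1280x720 frames per second (progressive scan).
# 1080i60: 60 fields per second, each half of a 1920x1080 frame,
#          i.e. 30 complete frames per second (interlaced scan).

def pixels_per_second(width, height, full_frames_per_second):
    """Total pixels delivered per second for a given mode."""
    return width * height * full_frames_per_second

p720  = pixels_per_second(1280, 720, 60)   # progressive: 60 full frames/s
i1080 = pixels_per_second(1920, 1080, 30)  # interlaced: 60 fields = 30 frames/s

print(f"720p60 : {p720 / 1e6:.1f} Mpixels/s")   # 55.3
print(f"1080i60: {i1080 / 1e6:.1f} Mpixels/s")  # 62.2
```

The numbers are close, which matches the usual rule of thumb: 1080i has more spatial detail per frame, while 720p handles fast motion more cleanly because every scan is a complete frame. A TV with a fixed-resolution panel deinterlaces and scales whatever it receives to its native resolution.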
