"Do I really need 1080p?" Well, NO, you don't!
Do some reading at this link.
There will NOT be any 1080p broadcast for many, many, many years, if ever!
Currently a 1080p set will cost $500 to $800 more than a 720p set.
The only 1080p source material will be on HD DVD/Blu-ray discs and maybe some games; that's it.
Digital HDTVs have a native resolution of 720p, 1080i, or now 1080p. They will rescale/deinterlace other video formats to that native resolution.
A 720p set will upscale 480p DVDs to 720p and downscale 1080i/p to 720p.
A 1080p set will upscale 480p and 720p to 1080p and will deinterlace 1080i to 1080p.
If you have any more questions just ask . John
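The rescaling rules John lists can be sketched in a few lines of code. This is my own toy illustration, not anything a real TV runs; the `conversion` function and its result strings are made up for this sketch.

```python
# Toy sketch (my own illustration) of the fixed-pixel-display rules above:
# a set converts every input to its native resolution, deinterlacing
# interlaced sources and scaling mismatched line counts.
def conversion(native, source):
    """Return the steps a fixed-pixel set performs on a given source.

    native: vertical native resolution of the panel, e.g. 720 or 1080
    source: input format string like "480p", "720p", "1080i", "1080p"
    """
    src_lines = int(source[:-1])
    interlaced = source.endswith("i")
    steps = []
    if interlaced:
        steps.append("deinterlace")  # merge odd/even fields into full frames
    if src_lines < native:
        steps.append(f"upscale {src_lines} -> {native}")
    elif src_lines > native:
        steps.append(f"downscale {src_lines} -> {native}")
    return steps or ["display as-is"]

print(conversion(720, "480p"))    # a 720p set upscales DVDs
print(conversion(720, "1080i"))   # ...and deinterlaces/downscales 1080i
print(conversion(1080, "1080i"))  # a 1080p set only needs to deinterlace
```

Running it reproduces the three cases John describes: upscaling 480p, downscaling 1080i/p, and deinterlacing 1080i to 1080p.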
Need, no; want, maybe
A 1080i signal is not scaled to 1080P, it is de-interlaced. In general a 1080i or 1080p signal from the same source would look identical on a 1080p display as long as the interlacing and de-interlacing are done properly.
If you have a 1080P TV and you receive a 1080i (or P) input you will get the full 1080 resolution. If you have a 720P TV and you receive a 1080i (or P) input the TV will have to scale the input to 720P and you will lose about 1/3 of the resolution inherent in the 1080 input.
It is hard to argue that anybody "needs" the extra resolution; many people are happy with their 480i standard TVs. In the next several years the gap in cost between 720P and 1080P TVs will be reduced; in the meantime it is a matter of personal preference, and pocketbook, whether the extra resolution is worth the extra cost.
Isn't 1080i somewhat of a misnomer? Wouldn't a more accurate name be 540i? Isn't it 540 lines displayed twice to give us 1080i? I read something like this elsewhere and it seemed to make sense. I know that broadcast channels have chosen both 720p and 1080i as HD formats. I would assume that a 720p broadcast (like Leno) would look better on a native 720p TV than a 1080i broadcast, and a 1080i broadcast (like Letterman) would look better on a 1080i set. I just figured that in this case, the TV would not need to rescale (or deinterlace) the picture. That would be an interesting side-by-side comparison.
540i would be 540i, 1080i is 1080i
A 540i signal would be 540 lines divided into two 270-line fields: the odd-numbered lines (1-539) and the even-numbered lines (2-540). Every 1/60th of a second one field would be displayed, alternating between the odd and even lines, giving you 540 lines of resolution.
A 1080i signal would be 1080 lines divided into two 540-line fields: the odd-numbered lines (1-1079) and the even-numbered lines (2-1080). Every 1/60th of a second one field would be displayed, alternating between the odd and even lines, giving you 1080 lines of resolution.
When the 1080i signal is de-interlaced the signal is combined so that all 1080 lines are displayed 60 times per second. Whether 1080i or 1080P the resolution is still 1080 lines.
As described in an earlier post, 1080P DLP sets achieve this by displaying the odd and even lines every 1/120th of a second so each line is displayed 60 times per second. Whether the input is 1080i or 1080P it is displayed the same.
Scaling and interlacing/de-interlacing are 2 different things.
When an interlaced signal is de-interlaced the resolution is not changed. The odd and even lines are merged together and they are all displayed at the same time (normally 60 times per second) but the content and format of the picture are the same.
When a signal is scaled the content and/or format of the picture is modified. If we start with a 480i input and want to display a 720p picture, the scaler would take the 480 lines of the original input and interpolate the additional 240 lines to insert between the original 480 lines in a fashion that would look natural. We have not added any detail because we still only have the details that were contained in the original 480-line input. If we start with 1080 lines of input (interlaced or progressive) and need to scale to 720 lines, 360 lines have to be removed and the remaining lines have to be modified to make the picture look as natural as possible.
This is a very simplified discussion intended to show the difference between de-interlacing and scaling, the actual processes are much more complicated.
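To make the distinction above concrete, here is a toy sketch of my own (the `deinterlace` and `upscale_nearest` helpers are made up for illustration; real video processors interpolate far more carefully): deinterlacing just recombines the two fields without changing the line count, while scaling changes the line count.

```python
# Toy illustration of de-interlacing vs. scaling (my own sketch, not an
# actual video pipeline).
def deinterlace(odd_field, even_field):
    """Weave two fields back into one frame: no lines added or removed."""
    frame = []
    for odd, even in zip(odd_field, even_field):
        frame.append(odd)    # lines 1, 3, 5, ...
        frame.append(even)   # lines 2, 4, 6, ...
    return frame

def upscale_nearest(lines, target):
    """Crude nearest-neighbor scaler: repeat source lines to fill the
    target count. Real scalers blend neighboring lines instead."""
    n = len(lines)
    return [lines[i * n // target] for i in range(target)]

odd = ["line%d" % i for i in range(1, 8, 2)]    # toy 4-line "odd field"
even = ["line%d" % i for i in range(2, 9, 2)]   # toy 4-line "even field"
frame = deinterlace(odd, even)
print(len(frame))                        # 8: same line count, recombined
print(len(upscale_nearest(frame, 12)))   # 12: scaling changed the count
```

The point the post makes shows up directly: deinterlacing preserves the content and line count, while scaling has to invent (or discard) lines.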
You explain a complex subject very well.
Yes, as the other poster said, a very good description. So, if I understand your description of scaling, there doesn't seem to be a single proper HD format, because there are both 720p and 1080i broadcasts, and unless we match a TV with the proper broadcast format, our TV is doing some scaling. And if there are more broadcasters sending 1080i signals, does it make sense to buy a 1080i TV rather than a 720p TV?
For the most part, the scaling is very good
All other things being equal, I would buy a 1080P set. However the scaling being done in most sets is very good. We are not talking about night and day differences.
Sometimes when we get to discussing that one is better than the other, and this one is better yet, it's easy to forget that the improvements are usually modest incremental differences.
Also there is a lot more to a good picture than the resolution. I was recently looking at TVs at a retailer and the biggest difference I could see between the TVs was the black level (often referred to as contrast level) and color accuracy.
I recently replaced 2 720P DLPs with 1080P DLPs. The improved black level was a more important improvement, to me, than the increased resolution.
One last note; CRT based TVs are not fixed pixel and they work differently. Suffice it to say that CRT TVs can usually play any signal (up to the maximum they can display) without scaling.
Thanks!! You have saved me a LOT of typing!!!
I have been saying this for the last two years, Thanks again!
Downscaling is easier to do than upscaling.
With upscaling, extra pixels must be "created" from the existing ones; if the upscaling does not do a good job the result will look choppy, blurred, etc. This was a problem on many HD sets over the last two years. John
540i would be 540i, 1080i is 1080i
Very informative post. You say that:
"When the 1080i signal is de-interlaced the signal is combined so that all 1080 lines are displayed 60 times per second. Whether 1080i or 1080P the resolution is still 1080 lines.
As described in an earlier post, 1080P DLP sets achieve this by displaying the odd and even lines every 1/120th of a second so each line is displayed 60 times per second. Whether the input is 1080i or 1080P it is displayed the same."
So it seems that a 1080p DLP set is actually showing an interlaced display, in that the odd and even lines are still displayed alternately, not progressively. The picture fidelity is improved by doubling the frequency of the display, not by combining the lines into a progressive display.
A Little Clarification Here For....
....people new to video. You said:
"If you have a 720P TV and you receive a 1080i (or P) input the TV will have to scale the input to 720P and you will loose about 1/3 of the resolution inherent in the 1080 input."
This might lead a newbie to presume that the 720p display gives only 67% as good a picture as the 1080p set. Actually, this increase in resolution is barely noticeable, if at all, depending on the viewing distance from the TV. This is important to know, as most viewers at maybe 8-10 feet or more might not be able to see a difference between 720p and 1080p.
A new high definition DVD movie (a quality one if anyone ever releases one!) will look superb on both sets and slightly (emphasis on slightly) better on the 1080p.
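The viewing-distance point above can be checked with back-of-envelope math. This is my own sketch (the function name is made up), using the common rule of thumb that a 20/20 eye resolves roughly 1 arcminute of detail:

```python
import math

# Back-of-envelope check of the claim above: past typical couch
# distances, the extra lines of 1080p fall below what the eye resolves.
def max_useful_distance_ft(diagonal_in, lines):
    """Farthest viewing distance (feet) at which one pixel row of a 16:9
    screen still subtends ~1 arcminute, i.e. is still resolvable."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # 16:9 screen height
    pixel_in = height_in / lines                     # height of one row
    one_arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(one_arcmin) / 12

for lines in (720, 1080):
    d = max_useful_distance_ft(50, lines)
    print(f'50" {lines}p: fully resolvable out to ~{d:.1f} ft')
```

On a 50" screen this works out to roughly 6.5 ft for 1080p and about 9.8 ft for 720p, which is consistent with the post's point that viewers at 8-10 feet or more may see no difference.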
Yes, the differences can be modest
I agree, it is easy to describe the differences in a way that makes them sound greater than what we actually see. There are also more variables involved in the quality of a picture than resolution.
HDDVD doesn't support 1080p
What are you talking about?
HD-DVDs are mastered in 1080p. Any 1920x1080 set will just deinterlace the 1080i and make it progressive. Expect the next HD-DVD players to output in 1080p. That's a false advantage of BD. Don't make overt Noob statements like that please.
This forum is for everyone in general. Some folks can get testy at times but one of the valuable saving graces here is the good natured info the more highly informed post, sometimes with a jab of humor.
Noobs (or Newbies) inhabit the space with industry insiders whether that be at retail, broadcasting, etc. or just knowledge offered by the occasional engineer, for instance.
Theoretical info (scaled down, to a small degree, to a beginner intro), plain general knowledge, and personal experience all make this place a pleasant visit.
Surely just plain wrong claims are sometimes made and do deserve to be put on notice. Just that a lot of us appreciate people being generally mannerly.
HD DVD vs. Blu-ray... is the war over yet????? NO! Oh well!
In theory, when an HD DVD/Blu-ray player is connected to an HDTV via HDMI, the two will do a "handshake" routine and then agree on the "best" format that both can support, whatever that is. John
Actually it does
Most, if not all, HD DVDs are mastered in 1080P. The output, for HD DVDs can be either 720P or 1080i. If you have a 1080P TV the 1080i input will be de-interlaced to 1080P in the TV.
The Samsung Blu-ray player converts the 1080P on the disc to 1080i, then converts the 1080i back to 1080P. If you use the 1080i or 1080P output of the Samsung into a 1080P TV the picture will look the same.
It may seem odd to convert 1080P to 1080i, just to convert it back to 1080P. I can't find confirmation of this, but I believe that HD DVD and Blu-ray discs that are mastered from film are mastered in 1080P at 24 frames per second; 24 FPS is the standard for film. The DVD player then converts the 1080P 24 FPS to 1080i at 60 fields per second, since 60 is the standard for video and TVs. Because 60 is not a multiple of 24, the conversion requires a process called 3:2 pulldown. I don't understand the exact process, but in general, frames are repeated in a pattern that keeps the video at the proper speed. There are some TVs that can use different refresh rates, but most are fixed at 60.
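The 3:2 pulldown mentioned above can be sketched in a few lines (my own toy illustration; `pulldown_32` is a made-up name): 24 film frames per second become 60 video fields per second by repeating frames in an alternating 3-field/2-field pattern.

```python
# Toy sketch (mine) of the 3:2 pulldown cadence: each film frame is
# shown for 3 fields, then the next for 2 fields, alternating, so
# 24 frames/s fills exactly 60 fields/s.
def pulldown_32(frames):
    """Expand film frames into interlaced fields using 3:2 cadence."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2  # 3 fields, then 2, then 3, ...
        fields.extend([frame] * repeats)
    return fields

film = ["A", "B", "C", "D"]     # 4 film frames = 1/6 second at 24 FPS
fields = pulldown_32(film)
print(fields)        # ['A','A','A','B','B','C','C','C','D','D']
print(len(fields))   # 10 fields = 1/6 second at 60 fields/s
```

Note that 4 frames map to exactly 10 fields, so a full second of film (24 frames) maps to exactly 60 fields, which is why the cadence keeps the video at the proper speed.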
WRONG! Only from component or composite inputs......
The Blu-ray player only down-converts when hooked up via analog inputs such as component and composite. Using HDMI, you get a pure 1080p signal without compression or interlacing. ALWAYS go HDMI! Either direct from the player to the HDTV or through a 1080p pass-through receiver such as the Onkyo TX-SR674, Yamaha RXV 1700 & 2700, Denon 2307 & 2807.
Mr. Breslin, happy holidays, where have you been?
Stewee here. I finally got my Sony Blu-ray player fired up, BAM BAM. Used the HDMI cable to my 72" Toshiba DLP set and optical to the amp. Fact is, there is a difference in picture quality between 1080i and 1080p, but I really don't think you are going to see the difference on smaller screen sizes. If someone has a good front projector or a huge rear-projection TV, then I would say go for it. Another thing about this player is it does not upconvert standard DVD movies like my old Denon 2910/955, so I had to stack that on top of the Blu-ray unit and now the floor is starting to sag. Sure have missed your fine posts, where have you been? Merry Xmas to you and family, stewart norrie
We have been traveling and I have not had time to check in. I had this thread set to notify me if there was any activity, so I got an email when you made your post.
Merry Christmas to you and your family.
You are so right!
I love that you actually look at the picture and have some understanding of the #'s and the I vs P! It's very confusing to most consumers. So... I'd like to have the new TV tech, but! I'm very detail oriented and don't like the so-called artifacts: stairstepping, screen door, loss of detail, and the worst being blurring and in/out of focus during action/sports scenes!!!
Buying a new tv, do i really need 1080p
I know that many of the individuals that have responded are giving you accurate and honest information.
However, I have spent a fortune on various video recorders, televisions, sound systems, cameras, and my list could go on and on, but it is my experience that you "should" buy whatever you wish and forget the specs.
I know that this will irk some of the readers, but if you find a television you like, buy it and forget it!
Just a thought,
P.S. All of this technology is crap and as soon as you get your ultimate system, I swear the next day they will come out with something better.
Lastly, with Panasonic's new 103" 1080P it only takes about $25,000.00 in additional funds for all of the junk to make it look right.
For any screen size over 65" I would just get a projector...
Anyways the way I see it is that eventually BR or HD-DVD will become more affordable and more standardized. If you are buying a new TV now, why not get one that supports 1080p HDMI input so you won't have to get a new TV a few years down the road to enjoy more abundant 1080p content. Also if you are buying a PS3 you better have a 1080p TV to get your money's worth out of Sony's super expensive console.
The KDS series from Sony are the best 1080p sets out there and they aren't THAT expensive (~$2500 for the 50").
I like the LCoS tech better because plasmas burn in way too easily while playing video games.
reply @drmet and @bearvp
The other suggestions are nice and I agree with your post, but I believe the important issue is knowing where the line is. I'm more versed in PCs, so I'll use that example. People who know very little about PCs, about to buy their first one, may be VERY lost. I've actually met people on online forums who couldn't decide which Alienware PC to get for a home business when a Dell Dimension 4600 at only $400 would've been more than sufficient. Or people who buy iPods because they look neat when, in fact, they were better off going against the crowd and getting another brand of MP3 player instead. Once we make sure these question askers know what they really need/want, then general info and suggestions would be sufficient, on the grounds that in the end the result is the same.
From what I heard, the PS3 won't support 1080p right away due to the learning curve for programmers/developers to fully utilize its full HD potential. Loose speculation is to give this resolution a year or two to fully come into play.
Kinda right, kinda wrong
The PS3 will support BD right out the box, so you will get 1080p movie goodness out the box. That should justify a 1080p set purchase. Games on the other hand are a different story and go along with what you said. Metal Gear Solid 4 will be in 1080p and the producer Hideo Kojima has said so. THAT ALONE IS WORTH $600! I don't care about the other games LOL.
1080P for use with home theater PC
Lots of great information in the replies about the technical specifications, scaling and such.
The MAIN reason to go with a progressive scan TV at high resolutions is for connecting with your PC through DVI output or through video games with HD output.
One must also consider the Kell factor: 1080i can look worse than 720P due to detail loss and flicker filtering.
Also, the closer you sit to the television the higher resolution and DPI you are going to want.
So if it is meant only for a TV, to be viewed from across the room, then no, 1080P is not necessary yet (though recommended).
If you intend to play HD movies from your PC (BluRay or HDDVD player also) or HD games from a PC or Xbox (or have a small living room), then go with high-res progressive.
I think you can see an improvement with 1080p
When I was looking for an HDTV, I went to a store that had about 30 or so HDTVs of varying resolution (720p, 1080i, 1080p) up on display. As I casually walked by checking them out (Sonys, Samsungs, Toshibas, etc.) I was able, every time, to tell I was looking at a 1080p set before I read what resolution the set had. As long as an HDTV channel was displayed (cable TV coming in at 1080i) I could tell every time. There was a discernible qualitative improvement; it was not huge and I am no expert, but I really could see it. It was, of course, more noticeable when the sets were larger. All the sets were sharing the same cable TV signal.
So, I bought a 56" 1080p set. No regrets.
1080p is going to be available on the PS3, and as Blu-ray and even upconverting get more available, 1080p will be the way to go.
If you can afford it, definitely do it. I've seen it and it looks distinctly better than 1080i, particularly with footage containing a lot of movement.
Difference between 1080i, and 1080p
It all depends really on what type of picture you want. To get full high definition I would seriously recommend 1080p, the P standing for progressive, meaning you are going to get a complete, stable picture, whereas 1080i is interlaced. Just as good, but the TV has to do more work to give you that HD picture. Granted, with both you will be receiving 1920 x 1080 pixels of resolution, but if you want the best in HD go with 1080p. You won't be disappointed.
1080i vs 1080p Clarification.
Actually the TV, no matter what display type, has to work twice as hard to produce 1080p. You're scanning twice as many lines in the same 1/60 sec so the bandwidth of all video amplifiers must be doubled. If we were talking about a CRT type display, the horizontal scanning freq. would have to be doubled also. Much more power required.
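The "twice as hard" claim above is easy to verify with pixel arithmetic. This is my own back-of-envelope sketch (the `pixel_rate` function is made up, and it ignores the blanking intervals that real signals also carry):

```python
# Rough arithmetic (mine) behind the bandwidth claim above: 1080p at
# 60 frames/s pushes twice the pixels per second of 1080i at 60 fields/s,
# because each interlaced field carries only half the lines.
def pixel_rate(width, lines, rate_hz, interlaced):
    """Active pixels per second (ignoring blanking intervals)."""
    lines_per_pass = lines // 2 if interlaced else lines
    return width * lines_per_pass * rate_hz

rate_1080i = pixel_rate(1920, 1080, 60, interlaced=True)
rate_1080p = pixel_rate(1920, 1080, 60, interlaced=False)
print(f"1080i60: {rate_1080i / 1e6:.1f} Mpixels/s")  # ~62.2
print(f"1080p60: {rate_1080p / 1e6:.1f} Mpixels/s")  # ~124.4
print(rate_1080p // rate_1080i)                      # exactly 2x
```

So every video amplifier in the chain sees double the pixel rate for 1080p60, which is where the doubled-bandwidth (and, for CRTs, doubled horizontal scan frequency) requirement comes from.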
Screen sizes 60" and larger
1080p is an absolute must if you go to 60" or larger viewing. And even if a broadcast is in 1080i, the 1080p set will DE-INTERLACE & display it to give that huge picture a smoother HDTV look.