If you're definitely going to get HD cable/sat, and/or can live without over-the-air broadcasts, there's nothing wrong with HD monitors. I'm using mine to type this message right now...
HDTVs have more pixels than analog TVs. That's pretty obvious. When you watch an SD source on an HDTV, the image is stretched to fill the screen and use all of those pixels. The stretching process, which involves interpolation, is what causes people to complain. That, and the raw detail of an HD source, which makes SD look blurry by comparison (even a 100% digital SD source). If you watch some channels in HD and some in SD, it's very easy to notice how bad the SD channels look, but they really only look bad in comparison.
An SD source on an SD TV has one pixel in the signal for each pixel of the TV. The same is generally true for HDTV, though ABC, FOX, etc. use 720p signals (for now). ANYWAY, when you have an SD source on an HDTV, each signal pixel now gets stretched across two or three screen pixels, and some of the rendered pixels are given an intermediate value between two signal pixels (I hope that makes sense) to blend the colors together and smooth the transition. THAT'S interpolation. The stretching. And this is done internally by the TV, so how it looks can vary from one set to another, but none of them do it particularly well, because the number of pixels across and down on an HDTV isn't a whole-number multiple of an SD set's.
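If you want to see the blending in action, here's a toy sketch in Python of 1-D linear interpolation, the basic idea behind what a TV's scaler does when it stretches a row of pixels (real scalers use fancier filters; the function name here is just made up for illustration):

```python
def upscale_row(src, dst_len):
    """Stretch a row of pixel values to dst_len samples via linear interpolation."""
    out = []
    # Map each destination index back to a (fractional) source position.
    scale = (len(src) - 1) / (dst_len - 1)
    for i in range(dst_len):
        pos = i * scale
        left = int(pos)
        right = min(left + 1, len(src) - 1)
        frac = pos - left
        # Blend the two nearest source pixels -- this is the "intermediate value".
        out.append(src[left] * (1 - frac) + src[right] * frac)
    return out

row = [0, 100, 200]          # 3 "SD" pixels
print(upscale_row(row, 5))   # 5 "HD" pixels: [0.0, 50.0, 100.0, 150.0, 200.0]
```

Notice the 50 and 150 in the output: those pixels don't exist anywhere in the source, they're invented blends of their neighbors, which is exactly why the result looks softer than the original.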
Okay, enough words.... look:
SD res is 480 lines of vertical resolution
HD res is either 720 or 1080 lines
480 x 2 = 960, so in order to fill a 1080-line TV (a scale factor of 2.25), some pixels are doubled, and some have to be stretched to 3 pixels, blurred, blended, etc. On a 720-line set, the pixels have to be stretched 1.5 times. This makes it look inherently blurry.
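A quick sanity check of that arithmetic: 1080/480 = 2.25, so under simple nearest-neighbor stretching (the crudest possible scaler, used here just to make the unevenness visible), some source lines get repeated twice and some three times:

```python
def repeat_counts(src_lines, dst_lines):
    """How many times each source line appears after nearest-neighbor scaling."""
    counts = [0] * src_lines
    for i in range(dst_lines):
        # Each destination line i pulls from source line i * src / dst.
        counts[i * src_lines // dst_lines] += 1
    return counts

print(1080 / 480)                      # 2.25 -- not a whole number
print(set(repeat_counts(480, 1080)))   # {2, 3}: uneven repetition down the screen
```

That mix of 2x and 3x lines is the non-whole-number problem in a nutshell; blending (interpolation) hides the unevenness, at the cost of blur.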
If you have an LCD computer monitor, you can see interpolation at work when you use a display resolution lower than the "native resolution" (the actual number of pixels physically in the monitor). For example, if the monitor has a native resolution of 1280x1024 and you change the computer's res to 800x600, the screen will look blurry because the 800 pixels in the signal need to be stretched to fill the 1280 pixels of the monitor. When you watch a DVD on your computer, this happens too. If you watch the movie in a window, it'll look okay, but when you go fullscreen, it'll look blurry in comparison.
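Same arithmetic as the TV case, just with monitor numbers (using 1280x1024 and 800x600 purely as an example):

```python
# Running 800x600 on a native 1280x1024 panel forces fractional
# stretching in BOTH directions -- and not even by the same factor.
print(1280 / 800)   # 1.6 horizontally
print(1024 / 600)   # ~1.707 vertically
```

Neither factor is a whole number, so every frame has to be interpolated, which is why the non-native mode looks soft.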
Even if the HD formats had 960 lines and no blurring were used in the interpolation, people would complain that SD sources look blocky and pixelated, sort of like how a printed web page never looks as good as it does on the monitor. The printout looks worse because the printer is capable of finer detail than the monitor. Again, it's not that the source material is bad, or that the printer makes it look bad, but rather that the printer reveals the lack of detail.
Anyway, bottom line: it's not that the TV makes the SD source look bad, it's that the signal can never look good, because it has to be interpolated. Okay, maybe that's the same thing. I guess it's more accurate to say that an SD signal will never look as good as an HD signal, because of the interpolation.
Lol, does that make any sense?