Typically, HDMI inputs on TVs are indeed counterintuitive and even *broken* if you attempt to use them as a "PC display". There are two main technical reasons (limitations by design): the inputs only accept "Consumer Electronics" resolutions, and the input signal is subject to "overscan" plus further filtering no matter what.
The CE resolutions are essentially 1920x1080 and 1280x720; 1920x1080 comes in interlaced and progressive (non-interlaced) varieties, while 1280x720 is progressive only. The panel's native refresh rate is 50 or 60 Hz (which corresponds to interlaced video at that field rate), and progressive video material is typically 25 or 30 Hz. Anyway - let's abstract from all the frame-rate conversion quirks you can see even on demo screens in shops... The first important point: note that 1366x768, the native panel resolution typical for "HD-ready" TVs, is typically NOT accepted on the HDMI input!
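Just to make the point concrete, here's a tiny sketch - the format list below is only an illustrative, partial sample of the usual CE (CEA-861-style) modes, not anything exhaustive:

```python
# Partial, illustrative sample of the CE (CEA-861-style) video formats a
# typical TV HDMI input advertises - deliberately not exhaustive.
ce_formats = {
    (1920, 1080, "interlaced"),
    (1920, 1080, "progressive"),
    (1280, 720,  "progressive"),
    (720,  576,  "interlaced"),   # SD PAL
    (720,  480,  "interlaced"),   # SD NTSC
}

panel_native = (1366, 768)        # typical "HD-ready" panel resolution

accepted = any((w, h) == panel_native for (w, h, _scan) in ce_formats)
print(f"{panel_native[0]}x{panel_native[1]} accepted on HDMI: {accepted}")  # False
```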
Okay - from there you may easily conclude: HD-ready is terribly out of date these days, anybody would go full-HD anyway, certainly when shopping for a "PC display TV". Hehe - beware of gotcha #2: most TVs will indeed accept 1920x1080 from a PC via the HDMI input, but the TV will overscan it! That is, it will zoom in on the picture a bit (just a few per cent), cropping some pixels around the edge of your picture and at the same time rescaling the visible area by some non-integer factor. Next, the TV will likely throw in some "edge enhancement" filter for good measure...
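To put numbers on the damage, a minimal sketch - the 5% overscan figure is just an assumed, illustrative value (real sets tend to sit somewhere in the low single-digit per cent range):

```python
# Illustrative only: how a few per cent of overscan turns a 1:1 picture
# into a cropped, non-integer-scaled one. The 5% figure is an assumption.
SRC_W, SRC_H = 1920, 1080
OVERSCAN = 0.05                              # assumed 5% overscan

# The TV keeps only the central region of the input...
visible_w = round(SRC_W * (1 - OVERSCAN))    # 1824 px
visible_h = round(SRC_H * (1 - OVERSCAN))    # 1026 px

# ...and stretches that region back out to the full panel.
scale = SRC_W / visible_w                    # ~1.0526, clearly not an integer

print(f"cropped to {visible_w}x{visible_h}, rescaled by {scale:.4f}")
print(f"pixels thrown away: {SRC_W * SRC_H - visible_w * visible_h}")
```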
The unavoidable overscanning is a relic of the old days, when the edges of a PAL or NTSC analog signal would often carry snippets of digital data for service purposes (teletext, captions and the like), which would appear as "digital garbage on the edges" if the picture ever got displayed whole. For that reason, analog CRT TVs always overscanned a bit, and the scaling didn't harm picture quality very much, owing to the analog picture re-composition on the CRT screen. Well, modern LCD/Plasma TVs still overscan even the HD signals received via HDMI, even signals at the native resolution (that is, if the TV allows you to spoon-feed it the native resolution).
I've read rumours that HD broadcasters actually counter that by *shrinking* the actual visible content in the picture... It's a crazy world. No way for you to get a true 1:1 full-HD image from the camera all the way to your TV. The picture will always be scaled back'n'forth several times.
Note that analog VGA (DB15) inputs are considered "PC inputs" by definition: they support a much greater range of resolutions, and if you feed the TV its native display resolution, the TV performs no overscan and no filtering on the input video (perhaps some colour conversion to match gamuts). Yes, it's an analog transmission, subject to noise and limited bandwidth. Then again, with modern semiconductors the noise should stay below roughly -50 dB per colour channel, and the RAMDACs on modern VGA cards typically manage something on the order of a 400 MHz maximum pixel clock - whereas 1920x1080 @ 60 Hz needs roughly 173 Mpix/s (counting the typical blank space around the visible region). The necessary analog bandwidth is theoretically even lower (Mr. Nyquist would say one half of the pixel clock), so even those 400 MHz are no problem for modern silicon.
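A quick back-of-the-envelope check - the horizontal/vertical totals below are the commonly quoted CVT figures for a 1920x1080 @ 60 Hz PC mode, so treat them as illustrative:

```python
# Back-of-the-envelope pixel-clock / bandwidth check for 1080p60 over VGA.
# The totals are the commonly quoted CVT figures (active area plus blanking);
# treat them as illustrative rather than gospel.
H_TOTAL, V_TOTAL = 2576, 1120    # 1920x1080 active plus blanking
REFRESH = 60                      # Hz

pixel_clock = H_TOTAL * V_TOTAL * REFRESH           # pixels per second
nyquist_bw  = pixel_clock / 2                       # minimum analog bandwidth

print(f"pixel clock : {pixel_clock / 1e6:.1f} MHz")   # ~173 MHz
print(f"Nyquist BW  : {nyquist_bw / 1e6:.1f} MHz")    # ~87 MHz per colour channel
print(f"RAMDAC headroom vs 400 MHz: {400e6 / pixel_clock:.1f}x")
```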
Practically, in most cases I'd expect the bottleneck causing "horizontal pixel smear", ghosting of edges etc. to be the output EMC-compliance filter on the analog VGA output of your graphics card. This is a fairly simple RLC low-pass filter. You can improve your VGA picture quality by desoldering or shorting that filter, at the expense of voiding the EMC compliance of your VGA card (potentially irritating the FCC or your respective national EMC regulator). Some (old) VGA cables are also pretty bad. And years ago I saw an early LCD TV (from some cheap no-name brand) that smeared pixels on the DB15 input for some internal reason of its own... so try before you buy.
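To show why such a filter matters, here's a rough sketch with hypothetical (but plausible) component values; I'm modelling the filter as a plain first-order RC low-pass, which is a simplification of the real RLC network:

```python
import math

# Why an output filter smears pixels: hypothetical first-order RC model.
# R is the nominal 75-ohm analog video impedance; C is an assumed shunt
# capacitance of the EMC filter - both values are illustrative only.
R = 75            # ohms
C = 22e-12        # farads (assumed)

cutoff = 1 / (2 * math.pi * R * C)     # -3 dB point of the RC low-pass
pixel_clock = 173e6                     # ~1080p60 with PC blanking (see above)
needed_bw = pixel_clock / 2             # Nyquist-ish bandwidth estimate

print(f"filter cutoff   : {cutoff / 1e6:.0f} MHz")     # ~96 MHz
print(f"needed bandwidth: {needed_bw / 1e6:.0f} MHz")  # ~87 MHz
# If the cutoff sits near (or below) the needed bandwidth, sharp one-pixel
# transitions get rounded off -> visible horizontal smear / ghosting.
```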
If you still manage to find an LCD TV with a *DVI* input, chances are the DVI will behave much more sanely and support more resolutions than the "TV HDMI" input. Note that for single-link DVI, 1920x1080 @ 60 Hz with standard PC blanking is likely over spec (165 MHz max TMDS clock) - unless you massage your graphics card into some "reduced blanking" timings (which is possible) and you're lucky enough that your TV accepts them.
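A sketch of the difference - the timing totals are the commonly quoted CVT and CVT reduced-blanking figures, so take them as approximate:

```python
# Standard CVT vs. CVT reduced-blanking timings for 1920x1080 @ 60 Hz,
# checked against the single-link DVI TMDS clock limit.
# Totals are the commonly quoted figures; treat them as approximate.
DVI_SINGLE_LINK_MAX = 165e6      # Hz

modes = {
    "CVT (full blanking)":     (2576, 1120),   # h_total, v_total
    "CVT-RB (reduced blank.)": (2080, 1111),
}

for name, (h_total, v_total) in modes.items():
    clock = h_total * v_total * 60
    verdict = "fits" if clock <= DVI_SINGLE_LINK_MAX else "over spec"
    print(f"{name}: {clock / 1e6:.1f} MHz -> {verdict}")
# CVT:    ~173 MHz -> over spec for single-link DVI
# CVT-RB: ~139 MHz -> fits comfortably
```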
For the future, I'd be more optimistic if DisplayPort inputs start appearing on TV sets. DisplayPort comes squarely from the PC side of things, so there should be no reason to cripple such inputs with overscan + scaling + filtering.