Regarding VGA vs. HDMI: not so fast with your judgements.
Analog VGA can provide pretty good picture quality, if all the components are up to it. Most integrated graphics subsystems in relatively new chipsets (from the last several years) should be good enough - provided that the motherboard vendor didn't add an overkill EMC filter just before the motherboard's DB15 analog VGA connector. Any recent add-in graphics card by NVidia or ATI should do a decent job there. And if you need an insanely long cable, you can make your own - just use some quality 75-ohm coax for the RGB lines. One good thing about analog VGA: the DB15 VGA input on any flat-screen TV will take your signal as a "PC input" and apply only a minimal amount of processing - no scaling, no insane color enhancements, no edge enhancement, no noise removal. That is, provided that you configure your graphics driver to produce the display's native resolution.
HDMI on the other hand, can be tricky. If you're shopping for 1:1 pixel mapping from your "computer graphics video memory" to the LCD TV's screen matrix, the HDMI input on your TV may not be the right way to go.
More on that here:
http://forums.cnet.com/7726-7590_102-5053436.html
Regarding supported resolutions:
any PC graphics hardware made in recent years supports full HD resolutions in hardware. With the right driver to configure the timing registers, you can set up pretty much any resolution your imagination can come up with. Even the old "horizontal resolution divisible by 8" restriction is no longer a problem, as far as I can tell (it was a limitation of some very old VGA hardware) - that said, there are screens which really have 1360x768 rather than 1366x768 (note that 1366 isn't divisible by 8, which is likely why some panels and adapters settle for 1360). For such screens you should indeed use 1360x768 (the screen's native resolution) to keep all your thin lines perfect (a true 1:1 mapping of pixels).
The question is always which resolutions are supported by your graphics driver in the operating system of your choice. If you're a Windows user, it doesn't matter that Linux+X Windows lets you do crazy tricks with the resolution. In MS Windows, you can only select from a fixed set of pre-defined resolutions supported by your graphics driver (even if you untick the box saying "show only resolutions supported by my monitor" - Windows learns those via DDC/EDID). Especially cheaper/older graphics adapters come with drivers offering a limited set of supported screen geometries. I believe I've seen a screenshot of some NVidia configuration screen where you can configure everything down to the sync timing - but specifically in the case of NVidia, I'm pretty sure your TV's native resolution is supported out of the box, no need for tweaking. The same should apply to newer ATI drivers. Speaking of Intel, the king of onboard integrated graphics: their "desktop" drivers support a pretty wide portfolio of standard resolutions, and if you're not happy with those, you can still download Intel's "embedded" flavour of the driver and define your own mode! (Set your own Xres/Yres, horizontal/vertical refresh and pixel clock, or take some stock video mode and tweak the timings a bit to make your monitor happier, or some such.)
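To get a feel for the numbers involved when you define a mode by hand, here's a rough sketch (in Python) of how the pixel clock falls out of the timing parameters. The arithmetic itself is standard; the example feeds in the well-known CEA-861 blanking figures for 1080p60:

```python
def pixel_clock_mhz(h_active, h_blank, v_active, v_blank, refresh_hz):
    """Pixel clock = total horizontal pixels * total lines * refresh rate.

    The "blank" values cover the sync pulses and front/back porches,
    i.e. the invisible part of each line and each frame.
    """
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

# CEA-861 timing for 1920x1080 @ 60 Hz: 280 pixels of horizontal
# blanking and 45 lines of vertical blanking.
clk = pixel_clock_mhz(1920, 280, 1080, 45, 60)
print(round(clk, 2))  # 148.5 - the familiar 1080p60 pixel clock in MHz
```

This is also why "tweaking the timings a bit" matters: shrinking the blanking intervals lowers the pixel clock a mode needs, which can bring an otherwise out-of-spec mode within reach of your hardware.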
Regarding resolutions in terms of dpi: in the good old days of Windows 3.1 on 14" CRT monitors at 640x480, the resolution was about 60-75 dpi. And I have to say I still consider 75 dpi the optimum resolution for any kind of computing. Yes, you can see pixel edges if you sit across the table from the display. And that's the way I like it: my eyes don't "give up focusing" beyond the pixel size, and I can make use of information down to individual pixels.
Alas, PC displays quickly moved to higher dpi: at 1024x768 and 1280x1024 the norm was more like 90 dpi, and the modern-day high-res, widescreen-only displays are well over 100 dpi. The classic Windows interface with its fixed-size system font is of very little use at those resolutions. Consequently, GUI design has moved on to bigger fonts taking up more pixels - as a result, you have more pixels on your screen, but you cannot see an individual pixel anymore, and your effective screen space really hasn't grown...
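For reference, dpi here is just the diagonal pixel count divided by the diagonal screen size. A quick sketch - note that the monitor sizes for the two higher modes are my assumptions of typical monitors of that era, and the 14" figure uses the nominal tube size (CRT viewable area is an inch or so smaller, which pushes the effective dpi up a bit):

```python
import math

def dpi(x_res, y_res, diagonal_inches):
    """Dots per inch = diagonal resolution (in pixels) / diagonal size."""
    return math.hypot(x_res, y_res) / diagonal_inches

print(round(dpi(640, 480, 14)))    # 57 nominal; ~62 on a ~13" viewable area
print(round(dpi(1024, 768, 15)))   # 85 - assuming a typical 15" monitor
print(round(dpi(1280, 1024, 17)))  # 96 - assuming a typical 17" monitor
```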
That's why I've been considering an LCD TV instead of a PC monitor for some time. If you're shopping for the 75 dpi optimum for "across the table" office use, a 32" display with the classic 1080p "full HD" resolution should make a pretty good monitor, provided that
1) the color presentation is sane enough for your working needs, or can be adjusted
2) the brightness is low enough for watching from a close distance, or can be adjusted
3) the display can be persuaded to display the pixels 1:1, without further "multimedia enhancements"
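The arithmetic behind the 32" suggestion checks out - a quick sanity check using the same diagonal-pixels-over-diagonal-inches calculation:

```python
import math

# Diagonal pixel count of a 1920x1080 panel, divided by a 32" diagonal:
dpi_32 = math.hypot(1920, 1080) / 32
print(round(dpi_32, 1))  # 68.8 - just under the 75 dpi "optimum"
```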
Note that current LED-backlit TVs have pretty good power consumption compared to last year's CCFL models. On the other hand, cheap LED TVs may have a narrower color gamut than the previous CCFL models - e.g. in the presentation of yellow shades.
Yet in recent months, I've become aware of one more pitfall, common to all LCDs: a pixel does not have uniform color across its surface. Every pixel really consists of three rectangles, lit up by the three elementary colors (RGB). Thus a grey line on the screen, a single pixel wide, won't look like solid uniform grey "across the table at 75 dpi". Instead, you'll see a "rainbow edge", or just three parallel elementary-colored lines.
Maybe that's one of the reasons for ever higher dpi in the modern LCD displays.
On a good CRT with a "delta" style shadow mask, the mask's dpi used to be higher than your video mode's dpi, so that the whole square of the pixel had a more or less uniform color, yet the pixel had a reasonably well-defined surface. If you tried to squeeze the maximum resolution out of your CRT, you ended up at a resolution where the VGA pixel size just about matched the delta mask's "triangle pitch" (or, better yet, one step back from that, to avoid smeared pixels).
With LCDs, taking one step back means scaling/oversampling, and the resulting image won't be as crisp as the native resolution would be. Theoretically you could use exactly half the resolution on each axis, but that's hardly practical, as it would shrink your available screen space to a quarter of the native resolution...
Ahh well 