If you're looking to connect a computer to a TV or monitor, your choices are HDMI, DisplayPort, DVI, and VGA. What's the best connection?
With televisions, HDMI is the most common connector. But if you want to connect a computer to your TV (or you've got a new computer monitor), the options tend to be HDMI, DisplayPort, DVI, and sometimes old-school VGA.
Each connection has its pros and cons, and the best cable for your display isn't necessarily the one that came in the box.
Here are the differences.
Before we start, it's important to note that with the exception of VGA, all the other connections here are digital. So while the pixel resolution potentials vary with each connection, the quality otherwise does not. As in, 1,920x1,080/60 over HDMI is going to look the same as 1,920x1,080/60 over DVI and DisplayPort (assuming all other settings are the same). The logical extension of this is that the cables themselves also don't make a difference, in terms of picture "quality." Any cable capable of a specific resolution is either going to work over a certain distance, or not work. Check out my article "All HDMI cables are the same" for why this is.
HDMI
All TVs and most computer monitors have HDMI. It's easy to use, the cables are cheap, and best of all, it carries audio. If you're plugging your computer into a TV, your first choice should be HDMI. It will save you lots of hassle.
HDMI has limitations, though, and isn't always the perfect choice. For example, your TV likely has HDMI 1.4 connections, which max out at 3,840x2,160-pixel resolution at 30 frames per second. If you've gotten a new 4K monitor, you're limited to 30fps. Not until HDMI 2.0 will you be able to do 4K over HDMI at 60fps, and that will require new hardware (and probably a new TV).
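The 30fps ceiling is simple arithmetic: uncompressed 4K at 60fps needs more data than HDMI 1.4 can move. Here's a back-of-envelope sketch; the figures ignore blanking intervals (which add real-world overhead), and the link rates are HDMI's published throughput after 8b/10b encoding:

```python
# Rough check: does a given video mode fit a link's effective data rate?
# Simplified -- real HDMI also transmits blanking intervals, so actual
# requirements are somewhat higher than this.

def video_gbps(width, height, fps, bits_per_pixel=24):
    """Approximate data rate of uncompressed video, in Gbit/s."""
    return width * height * fps * bits_per_pixel / 1e9

# Effective payload rates after 8b/10b encoding (Gbit/s):
HDMI_1_4 = 8.16   # 10.2 Gbit/s raw TMDS
HDMI_2_0 = 14.4   # 18 Gbit/s raw TMDS

uhd_30 = video_gbps(3840, 2160, 30)  # ~5.97 Gbit/s
uhd_60 = video_gbps(3840, 2160, 60)  # ~11.9 Gbit/s

print(f"4K at 30fps needs ~{uhd_30:.1f} Gbit/s; fits HDMI 1.4: {uhd_30 <= HDMI_1_4}")
print(f"4K at 60fps needs ~{uhd_60:.1f} Gbit/s; fits HDMI 1.4: {uhd_60 <= HDMI_1_4}")
print(f"4K at 60fps fits HDMI 2.0: {uhd_60 <= HDMI_2_0}")
```

So 4K/30 squeaks under HDMI 1.4's limit with room to spare, while 4K/60 needs nearly half again as much bandwidth as the link can carry.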
So in most cases HDMI is fine, but for really high resolutions and frame rates, one of these other options might be better.
DisplayPort
DisplayPort is a computer connection format. Only one television has it, and don't expect much further adoption on the TV side. It's capable of 3,840x2,160-pixel resolution at 60fps, provided you have at least DisplayPort 1.2 and the Multi-Stream Transport feature. If you're connecting a computer to a monitor, there's no reason not to use DisplayPort. The cables are roughly the same price as HDMI.
DisplayPort can also carry audio.
DVI
The video signal over DVI is basically the same as HDMI. The maximum resolution potential depends on the equipment, though: single-link cables and hardware top out at 1,920x1,200, while dual-link can do more.
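The single-link/dual-link split comes down to pixel clock: a single DVI link tops out at 165 MHz, and dual-link doubles that by adding a second set of data pairs. A rough sketch of the math (the 12 percent blanking overhead here is an assumed figure approximating reduced-blanking timings; real modes vary):

```python
# Back-of-envelope single-link vs. dual-link DVI check. Single-link DVI
# tops out at a 165 MHz pixel clock; dual-link doubles it. The blanking
# overhead is an assumption, not an exact timing calculation.

SINGLE_LINK_MHZ = 165
DUAL_LINK_MHZ = 330  # two TMDS links

def pixel_clock_mhz(width, height, fps, blanking=1.12):
    """Approximate pixel clock with a rough reduced-blanking allowance."""
    return width * height * fps * blanking / 1e6

for w, h in [(1920, 1200), (2560, 1600)]:
    clk = pixel_clock_mhz(w, h, 60)
    link = "single-link OK" if clk <= SINGLE_LINK_MHZ else "needs dual-link"
    print(f"{w}x{h} at 60fps -> ~{clk:.0f} MHz ({link})")
```

That's why 1,920x1,200 is the practical single-link ceiling, and why higher-resolution monitors ship with dual-link DVI cables.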
DVI generally doesn't do audio (it varies). So if you're using a TV, use HDMI. Since computer monitors don't usually have speakers, this isn't an issue.
VGA (aka PC-RGB, D-sub 15)
The old-school VGA connector is a cable of last resort. It's not too common anymore, and hardly ever found on TVs. A recent e-mail asked about it, so I'm including it.
Don't use VGA, not if you can help it. While it is capable of fairly high resolutions and frame rates, it's an analog signal. You're not likely to get a pixel-perfect image with today's LCD monitors, which is why you'd use DVI instead.
What about Thunderbolt?
Thunderbolt, the Intel/Apple love child, is technically only available on one monitor (the Apple Thunderbolt Display). There are likely to be more, but don't expect some sort of Thunderbolt revolution. The connection is compatible with Mini DisplayPort.
You can convert some of these cables into others. For example, DVI and HDMI are generally convertible using a simple adapter. Some DisplayPort connections will also work with DVI and HDMI with an adapter, but not all.
All modern televisions will convert the incoming signal to whatever their "native resolution" is. For most TVs, this is 1,920x1,080 pixels. So if you send a TV 1,280x720-pixel-resolution material, it will upconvert that to 1,920x1,080. TVs tend to be pretty good at this (though they won't accept every resolution; check your owner's manual for which ones). However, you're better off setting your computer's resolution to be the same as the TV's (presuming it doesn't set itself automatically, as it should). Matching resolutions means pixel-for-pixel accuracy and no upconversion blurring or artifacts. This is especially true for computer monitors, which rarely have the quality scaling processing of their TV cousins. Send a computer monitor a non-native resolution and it will work...but it's not going to look as good as it should.
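The pixel-for-pixel point is easy to see in the scale factors involved: anything other than a 1:1 mapping means each source pixel gets smeared across a fractional number of panel pixels. A quick sketch (the 1080p panel and source resolutions are just example figures):

```python
# Why matching native resolution matters: a non-1:1 scale factor forces
# the display to interpolate, blurring the image.
from fractions import Fraction

PANEL = (1920, 1080)  # native resolution of a typical 1080p display

def scale_factor(src_w, src_h, panel=PANEL):
    """Horizontal and vertical scale factors from source to panel."""
    return Fraction(panel[0], src_w), Fraction(panel[1], src_h)

for src in [(1920, 1080), (1280, 720), (1366, 768)]:
    fw, fh = scale_factor(*src)
    note = "pixel-for-pixel" if fw == fh == 1 else "upconverted (blur risk)"
    print(f"{src[0]}x{src[1]} -> x{fw} / x{fh} ({note})")
```

A 1,280x720 source lands on a 1.5x factor in both directions, so every two source pixels must cover three panel pixels; only the native 1,920x1,080 maps cleanly.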
Check out "What is upconverting?" for more info.
OK, so, generally, HDMI is fine. If you're using a really high-resolution monitor, go DisplayPort. The other options all have serious drawbacks. If you're connecting a PC to a TV, check out this post on how to use your TV as a computer monitor for gaming, videos, and more.
Lastly, the one tricky factor in all this is that not all your equipment might support the native resolution you want to send. With TVs this isn't likely a problem as nearly all are 1,920x1,080, but with monitors and their more varied native resolutions, it's a little trickier. Check your owner's manual to verify what your monitor's native resolution is (always send the native res, when possible), and to make sure it's capable of accepting that resolution with the cable you want to use.
Got a question for Geoff? First, check out all the other articles he's written on topics such as why all HDMI cables are the same, LED LCD vs. OLED, why 4K TVs aren't worth it and more. Still have a question? Tweet at him @TechWriterGeoff, then check out his travel photography on Instagram. He also thinks you should check out his sci-fi novel and its sequel.