Ultra HD, colloquially known as "4K," is the latest buzzword, and the latest push from TV manufacturers.
While your next TV might not be Ultra HD, the one after probably will be.
Here are the basics of what you need to know about this latest advancement in TV technology.
Higher than HD resolution, and possibly more
To put it most simply, Ultra HD is resolution greater than HD. Today this most commonly means a horizontal resolution of 3,840 pixels and a vertical resolution of 2,160. This is four times the pixels of 1080p, which is 1,920x1,080. Officially, Ultra HD is a minimum of 3,840x2,160, also known as Quad Full HD. It also includes cinema 4K (4,096x2,160) and future resolutions like "8K," or 7,680x4,320.
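If you want to see where the "four times" figure comes from, the arithmetic is simple enough to check yourself (a quick sketch; the variable names are just for illustration):

```python
# Pixel-count math behind the "four times 1080p" claim.
uhd = 3840 * 2160      # Ultra HD: 8,294,400 pixels
full_hd = 1920 * 1080  # 1080p: 2,073,600 pixels

print(uhd // full_hd)  # 4 -- twice the width times twice the height
```

Note it's four times the total pixels, not four times the width or height; each dimension only doubles.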
With current products and content, Ultra HD is almost entirely about this increase in resolution. Resolution is just one part of a good picture, however, and not the most important. Behind the scenes there's movement toward a new standard, called Rec. 2020, which improves other aspects of the image, like color and frame rate. There are no TVs that support this proposed standard yet, but we shall see.
Can you even see the difference?
Probably not. There's only so much detail the human eye can resolve. If you have 20/20 vision (common), sit about 10 feet from your TV (also common), and are buying a typical TV (50 inches or so), you're not going to see the additional resolution. Check out Four 4K TV facts you must know for more info, or check out Chris Heinonen's excellent 4K Calculator to see if you can benefit from the extra resolution.
If you sit closer, or plan on getting a big TV or projector (80+ inches), then 4K becomes much more worthwhile. For example, last year I checked out Samsung's 85-inch S9, and it was gorgeous when fed the highest-quality 4K demo content.
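The math behind those distance claims can be sketched with the common rule of thumb that 20/20 vision resolves roughly one arcminute of detail. This is the same assumption calculators like Heinonen's are built on, though real eyes vary, so treat it as a rough estimate (the function name and 16:9 default are my own illustration):

```python
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute: the classic 20/20 acuity limit

def max_benefit_distance_ft(diagonal_in, horizontal_px, aspect=16 / 9):
    """Farthest viewing distance (in feet) at which individual pixels
    are still barely resolvable, per the 1-arcminute rule of thumb."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pixel_in = width_in / horizontal_px                      # one pixel's width
    return (pixel_in / math.tan(ARCMIN)) / 12                # inches -> feet

print(round(max_benefit_distance_ft(50, 3840), 1))  # 3.3 -- a 50-inch 4K TV
print(round(max_benefit_distance_ft(85, 3840), 1))  # 5.5 -- an 85-inch set
```

In other words, by this rule a 50-inch 4K set stops showing extra detail past roughly 3 feet, which is why a 10-foot couch distance wastes the resolution, while a big screen or projector keeps the benefit from farther back.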
"Ultra HD" vs "4K"
The official moniker for this new resolution is Ultra HD. However, it's commonly referred to as "4K" or even "2160p." 4K is the cinema standard that deals with a similar resolution (generally 4,096x2,160).
Most people, myself included, would rather just call it 4K. Yes, this isn't strictly accurate, but I'm not nearly pedantic enough to care. Some are. Also, 4K is easier to type and say. We did a poll, and most people agree with me.
For the record, the Consumer Electronics Association, the closest the U.S. TV industry has to an authority in this matter, sees "4K Ultra HD" as a "legitimate use" in line with its guidelines. That catchall term, or some variation thereof, appears to be what most TV makers are using, at least for now.
What about HDMI?
Your current HDMI cables can probably pass 4K. There is a new HDMI standard coming out, called HDMI 2.0. The main difference between HDMI 1.4 (which is what most equipment uses now) and 2.0 is an increase in the possible frame rates of a 4K signal. HDMI 1.4 can do Ultra HD at 30 frames per second; HDMI 2.0 can do 60fps.
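That frame-rate ceiling comes down to raw bandwidth. A rough back-of-the-envelope for 8-bit video, ignoring blanking intervals and line-coding overhead (the ~10.2 and 18 Gbps figures are the nominal HDMI 1.4 and 2.0 link rates):

```python
def video_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed pixel data rate in gigabits per second.
    Simplified: ignores blanking intervals and 8b/10b coding overhead."""
    return width * height * fps * bits_per_pixel / 1e9

print(video_gbps(3840, 2160, 30))  # ~6.0 Gbps: fits within HDMI 1.4's ~10.2 Gbps
print(video_gbps(3840, 2160, 60))  # ~11.9 Gbps: too much for 1.4; HDMI 2.0's 18 Gbps handles it
```

Doubling the frame rate doubles the data, which is exactly what pushes 4K/60 past what HDMI 1.4 hardware can carry.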
There are no new cables for HDMI 2.0; this is a hardware change, not a cable change. So your current High Speed HDMI cables should work just fine.
However, check carefully that any 4K TV you're considering has full HDMI 2.0 compliance. If it doesn't, you're not "future-proofing" yourself as much as you may think. Nearly all of the 2014 4K TVs announced at CES comply with HDMI 2.0, but it's a stickier situation with 2013 models.
There's also a new form of content protection coming, called HDCP 2.2. While not a big deal right now, if your 4K TV doesn't have it, you might not be able to watch all the 4K content down the road.
While the march of technology, um, marches on, some things get left behind. One of the biggest casualties is plasma. Panasonic, which recently pulled out of plasma production, listed the difficulty of making Ultra HD plasmas as one of the reasons it abandoned the technology. Samsung echoed this reasoning.
Could OLED, which faces similar challenges, be another casualty? Let's hope not.
Like it or not, this is happening. Ultra HD displays are the future. Prices are already falling from the major manufacturers, and lesser-known brands, primarily Chinese names like TCL and Seiki, are already selling ultra-cheap Ultra HD sets.
A big issue right now, as in the early days of HD, is the lack of content. Netflix claims it will start streaming certain shows in Ultra HD this year, and if you buy a Sony TV you can get a 4K media streamer, but there's very little to watch. The new consoles (PS4 and Xbox One) can't do Ultra HD gaming, though a reasonably powerful PC can.
There is content in the pipeline, though. The major movie studios have been converting their archives to 4K (and greater) for several years. Ultra HD Blu-ray is still just rumors, but is likely to happen. The new H.265 codec promises better compression, so Ultra HD can be had at reasonable bitrates. And since many 2014 4K TVs have H.265/HEVC decoding built in, industry insiders are picking streaming to be the first widespread 4K delivery mechanism. As for actual broadcast or cable TV channels in 4K? That'll be a while.
So content is an issue now, but it won't always be. It's just something to keep in mind if you feel you must get a 4K TV now: there isn't much content that takes full advantage of it.
What does all that mean for you? Right now, Ultra HD televisions are impressive up close, but not a vital purchase. It's likely that in a few years, we'll get better, cheaper, and more capable Ultra HD TVs. Ideally, with a lot more 4K stuff to watch on them.
For a more in-depth look at the history and more of the issues surrounding 4K, check out What is 4K UHD? Next-generation resolution explained.
Got a question for Geoff? First, check out all the other articles he's written on topics like why all HDMI cables are the same, LED LCD vs. plasma, active versus passive 3D, and more. Still have a question? Send him an e-mail! He won't tell you what TV to buy, but he might use your letter in a future article. You can also send him a message on Twitter @TechWriterGeoff or Google+.