It's an easy sell: Four times the resolution of HDTV. The numbers are a cinch to understand, 2160p easily identifiable as more than 1080p. Every manufacturer is pushing their top-of-the-line 4K TVs as the next generation of superlative picture quality.
But resolution is just one aspect of picture quality. Is it possible that by focusing so intently on this one easily marketed improvement, other aspects are getting neglected, or worse, diminished, at least in the short term?
The short answer? Yep. Here's how.
Yes, Emperor, your new duds are exceptionally high rez
Ultra HD, also called 4K, has four times the resolution of HD: 3,840x2,160 vs. 1,920x1,080. We first started hearing the murmurs of 4K a few years ago, and at this year's CES there was talk of little else from TV makers.
Others and I, in side-by-side comparison reviews and with calculators based on human visual acuity, have pointed out that at the TV sizes most people buy, and the distances most people sit, you'd be hard-pressed to see a difference between 4K and 1080p (all else being equal). On the other hand, we were careful to say that this didn't matter in terms of market adoption. The move from HD to Ultra HD is inevitable.
In certain cases, like computer monitors (where you're sitting closer), projectors (where the image is huge), and truly massive TVs (ditto), the higher resolution of 4K is highly beneficial. In a 42-inch TV from 10 feet away...not so much.
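The viewing-distance argument can be checked with a little trigonometry. Here's a rough sketch, assuming 20/20 vision resolves roughly one arcminute per pixel (the exact acuity threshold varies by person and by content):

```python
import math

def pixel_angle_arcmin(diagonal_in, horiz_pixels, distance_ft, aspect=(16, 9)):
    """Angular size of one pixel, in arcminutes, as seen by the viewer."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)   # screen width from diagonal
    pitch_in = width_in / horiz_pixels                 # width of a single pixel
    angle_rad = 2 * math.atan(pitch_in / (2 * distance_ft * 12))
    return math.degrees(angle_rad) * 60

# A 42-inch TV viewed from 10 feet:
hd = pixel_angle_arcmin(42, 1920, 10)   # 1080p pixel: ~0.55 arcminute
uhd = pixel_angle_arcmin(42, 3840, 10)  # 4K pixel: ~0.27 arcminute
```

At that size and distance, even a 1080p pixel is already smaller than the roughly one-arcminute acuity limit, so doubling the pixel density buys nothing visible. Halve the distance or double the screen size, though, and the math changes in 4K's favor.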
OK, so maybe the benefit of this higher resolution is small to nonexistent, but how does 4K UHD actually impair image quality? Glad you asked.
The first issue: LCD
Right now, all available 4K TVs are LCDs. While LCD technology has improved over the years, its overall picture quality is still not as good as plasma or OLED. So the tradeoff is higher resolution in exchange for lower contrast ratios, more motion blur, and poorer off-axis viewing.
But it's actually worse than that. 4K killed plasma. It's not dead yet, but it's getting there. One of the reasons, Panasonic reported, was the difficulty of making a cost- (and likely, energy-) efficient 4K plasma. Samsung told David Katzmaier something similar, though they're still making plasmas (this year, at least).
So picture quality fanatics lost some of the best-looking televisions ever, because the industry wanted to move to a higher resolution. This is sort of like saying "All new vehicles must have four wheels, because four are better than two!" And like that, every motorcycle enthusiast is left out in the cold.
And...it gets even worse. Where's OLED, which produces an even better picture? After a big splash last year, both Samsung and LG have been rather silent. Some OLEDs were shown at CES, but when will they actually be released?
This is, at least partially, a rhetorical question. It's meant to point out that OLEDs are currently very expensive to produce. LCDs, in comparison, are much less expensive. This is only exacerbated by the need to quadruple the pixel count to make a 4K OLED TV.
So is 4K making OLED development more difficult? Certainly. Is that why we're not seeing more OLED TVs in 2014? Maybe, maybe not. Samsung and LG are practicing the delicate art of silence.
The second issue: Bandwidth
There's more to it, unfortunately. A 4K signal is much larger than a 1080p signal. There are two choices: increase the size of the bandwidth pipe, or increase the compression so the signal fits in the current pipe (or both).
Increasing the size of the pipe isn't that likely. In the US, we have some of the slowest average broadband speeds in the developed world. While there's some movement toward greater speeds (FiOS, Google Fiber), 50Mbps averages aren't going to happen for the majority of people any time soon. Most cable and satellite HD signals are already compressed to rubbish, so what makes anyone think that 4K over the same services will look any better?
Then there's the compression. H.265/HEVC is the newest codec, and it promises to offer much better compression than current codecs. Better compression doesn't have to mean more artifacts, but once again this is a massive signal getting squeezed down some pretty small pipes. Just because H.265 looks good in tests, doesn't mean it will continue to look good once cable/sat providers get ahold of it. After all, the codecs used now can look good, but often don't.
We haven't seen any streaming 4K in action yet in the real world, but we agree with Chris Heinonen's best guess, given the 15Mbps 4K streams mentioned by Netflix at CES: it will probably look worse than 1080p Blu-ray.
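A quick back-of-the-envelope calculation shows why a 15Mbps 4K stream is worrying. This sketch assumes 24fps, 8-bit 4:2:0 video (12 bits per pixel uncompressed) and Blu-ray's 40Mbps maximum video bitrate; exact figures vary with frame rate and bit depth:

```python
def raw_bitrate_mbps(width, height, fps=24, bits_per_pixel=12):
    """Uncompressed video bitrate in Mbps (12 bpp = 8-bit video with 4:2:0 chroma subsampling)."""
    return width * height * bits_per_pixel * fps / 1e6

uhd_raw = raw_bitrate_mbps(3840, 2160)   # ~2,389 Mbps uncompressed 4K
hd_raw  = raw_bitrate_mbps(1920, 1080)   # ~597 Mbps uncompressed 1080p

netflix_ratio = uhd_raw / 15   # 15Mbps 4K stream: ~159:1 compression
bluray_ratio  = hd_raw / 40    # 40Mbps max-bitrate Blu-ray: ~15:1 compression
```

In other words, that 4K stream would need roughly ten times the compression ratio of a maxed-out 1080p Blu-ray, and compression that aggressive is exactly where artifacts come from.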
The takeaway here is something most people fail to consider: a bad 4K signal can look worse than a good 1080p signal. Easily. Just look at all the "HD" content on TV that, after over-aggressive compression, looks worse than a good DVD.
Yes, eventually bandwidth and compression will improve to the point where 4K streams look sharper than today's Blu-rays, provided you sit close enough to a large enough screen. But that probably won't happen anytime soon.
We all want better picture quality. But as I laid out in an earlier article, there are other aspects of the picture more important than resolution. How much these (and other) aspects are getting pushed aside in the crusade toward 4K is fairly clear: we're losing plasma and its picture quality, OLED could be further delayed, and heavier compression could negate whatever benefit the higher resolution had to begin with.
Is there an easy answer? Nope. Ultra HD is coming whether we want it or not. It's just sad that so many more beneficial improvements are getting slammed aside just because of one, less important, easily marketed, "improvement."
Got a question for Geoff? First, check out all the other articles he's written on topics like why all HDMI cables are the same, LED LCD vs. plasma, active versus passive 3D, and more. Still have a question? Send him an e-mail! He won't tell you what TV to buy, but he might use your letter in a future article. You can also send him a message on Twitter @TechWriterGeoff or Google+.