What is HEVC? High Efficiency Video Coding, H.265, and 4K compression explained
High Efficiency Video Coding (HEVC), also known as H.265, promises twice the compression possible with Blu-ray’s best video compression methods. But how does it work, and is it enough to get us better-looking 4K content?
I'd like to call it H.265, because it sounds cool, but its full name is High Efficiency Video Coding (HEVC). It's the successor to Advanced Video Coding (AVC), also known as H.264, which is one of the compression schemes used by Blu-ray.
The idea of HEVC is to offer the same level of picture quality as AVC, but with better compression, so there's less data to deal with. This is key if we want 4K/Ultra HD broadcasts (including satellite), 4K Blu-rays, and more.
But is it enough, and for that matter, how does it work?
Compression (the good, the bad, and the lossy)
The amount of raw data coming out the back of a professional HD camera is massive. There's no way to conveniently transport it to your home. Instead, the video is compressed to reduce the data to a more manageable amount.
There are many ways to do this, one of the easiest being reducing the quality. In some cases this is OK. Think of your average YouTube video. Not great, right? Often that's because the video is highly compressed (either before or during the upload). Heavy compression might keep the resolution technically the same, but the image can appear softer, noisier, or riddled with weird, distracting artifacts.
But that's not a great idea if the point is to preserve a director's intent, or show off your new 77-inch OLED.
So the other option is to use better compression. In this case, you can basically think of "better" compression as "smarter" compression: taking the same original video and finding smarter ways to reduce the amount of data without sacrificing quality. Every few years the processing power of gear improves enough to let more processor-intensive compression algorithms be used, further shrinking the data without making the image worse.
This distinction between "more" compression and "better" compression is important, as really, the terms aren't interchangeable in this context. You can decrease the amount of data required for a signal either by cranking up the compression and making the image ugly (just "more" compression), or using a more efficient compression technique ("better" compression).
Let me put it this way. Say you have a basket, and you need to fit 100 apples inside. You can do it with more compression (reducing the apples to sauce), or with better compression (finding a better way to make them all fit while preserving their appleness).
More compression: applesauce
Better compression: more apples, same space
As you can see from this delicious example, "more" compression is easy (SMUSH) while "better" compression requires more thought and/or better technology.
As data-intensive as HD is, 4K is even worse. While most of us were just getting used to the idea of H.264's advantages over MPEG-2 on Blu-ray, the Moving Picture Experts Group and the International Telecommunication Union's Telecommunication Standardization Sector (ITU-T) were already starting work on the next generation of video compression, with an eye on the future.
New compression standards aren't introduced for small, incremental improvements; each one has to be a sizable change. With each jump, the general rule is half the bit rate for the same quality (or greater quality at the same bit rate).
How does it do this? Largely by expanding on how AVC (and other compression techniques before it) works.
First, it looks at multiple frames to see what doesn't change. In most scenes in a TV show or movie, the vast majority of the frame doesn't change much. Think of a scene with someone talking. The shot is mostly their head. The background isn't going to change much for many frames. For that matter, most of the pixels representing their face probably won't change much (other than their lips, of course). So instead of encoding every pixel from every frame, an initial frame is encoded, and then after that only what changes is encoded (basically).
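The idea can be sketched in a few lines of toy Python. This is not real HEVC, just an illustration of the principle: store the first frame in full, then for each later frame store only the pixels that changed.

```python
# Toy illustration of inter-frame (delta) encoding -- not real HEVC.
# A "frame" here is just a flat list of pixel values.

def encode(frames):
    """Store the first frame in full, then only (index, value) changes."""
    keyframe = list(frames[0])
    deltas = []
    prev = keyframe
    for frame in frames[1:]:
        changes = [(i, v) for i, (p, v) in enumerate(zip(prev, frame)) if p != v]
        deltas.append(changes)
        prev = frame
    return keyframe, deltas

def decode(keyframe, deltas):
    """Rebuild every frame by replaying the stored changes."""
    frame = list(keyframe)
    out = [list(frame)]
    for changes in deltas:
        for i, v in changes:
            frame[i] = v
        out.append(list(frame))
    return out

frames = [
    [5, 5, 5, 5],   # initial frame: stored in full
    [5, 5, 9, 5],   # one pixel changes (the "lips")
    [5, 5, 9, 5],   # nothing changes: the delta is empty
]
key, deltas = encode(frames)
assert decode(key, deltas) == frames
assert deltas[1] == []      # a static frame costs almost nothing
```

A mostly static talking-head shot becomes a stream of tiny deltas, which is exactly why such scenes compress so well.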
HEVC then expands the size of the area that's looked at for these changes: larger and smaller "blocks," essentially, which offers additional efficiency. Ever seen blocks in your image when the picture goes foul? Those can be bigger, smaller, and differently shaped with HEVC than with previous compression methods. Larger blocks, for example, were found to be more efficient: HEVC's coding tree units can be as large as 64x64 pixels, compared with AVC's 16x16 macroblocks.
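Here's a hypothetical sketch of that idea: recursively split a square block into four quadrants only where the pixels actually differ, so flat regions get one big, cheap block and busy regions get many small ones. Real HEVC's coding-tree decisions are far more sophisticated, but the shape of the logic is similar.

```python
# Toy quadtree block splitting -- loosely inspired by HEVC's coding
# tree units, not the real algorithm. A block of identical pixels is
# kept whole; a mixed block is split into four quadrants, recursively.

def split_blocks(img, x=0, y=0, size=None):
    """Return a list of (x, y, size) blocks covering the square image."""
    if size is None:
        size = len(img)
    pixels = {img[y + j][x + i] for j in range(size) for i in range(size)}
    if len(pixels) == 1 or size == 1:
        return [(x, y, size)]           # uniform: one big block suffices
    half = size // 2
    blocks = []
    for dy in (0, half):
        for dx in (0, half):
            blocks += split_blocks(img, x + dx, y + dy, half)
    return blocks

# A flat 4x4 image with one busy corner.
img = [
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
]
print(len(split_blocks(img)))   # 7 blocks instead of 16 single pixels
```

Three of the four quadrants are flat and stay as single large blocks; only the corner with detail gets carved up further.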
Then other things were improved, like motion compensation, spatial prediction, and so on. Many of these techniques could have been used with AVC or even earlier, but they required more processing power than was economically feasible at the time.
During the development phase, the compression algorithm is tested objectively, for its raw number efficiency, but also subjectively, by video professionals comparing different compression methods and amounts in a "blind" test, where they don't know which method is which. The human element is crucial. Just because a computer says one level of compression is better than another doesn't mean it looks better than another.
Because H.265 is so much more processor intensive, don't expect a simple firmware upgrade to get your gear to decode it. In fact, that's part of the issue. You need a hardware decoder somewhere. If your new media streamer, cable box, or BD player has it, then you'll be all set (presuming you also have HDMI 2.0 so you can get 2160p/60 and not just 2160p/30). Could a high-end PC decode it via software? Maybe. Could the Xbox One or PS4? Not likely. Everyone loves their favorite console, but remember, this generation's hardware is equivalent to a pretty average PC.
Will it be enough?
Well, technically yes, but with a big caveat. Like AVC (and other compression standards) before it, H.265 is adjustable, depending on the bandwidth needed. Want 4K over a mediocre Internet connection? No problem; crank up the "dial" (remember the applesauce?). Want the best picture quality? No problem; turn the dial the other way.
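If you want to play with that dial yourself, here's a sketch using FFmpeg's libx265 encoder (this assumes an FFmpeg build with libx265 enabled, and `input.mp4` is a hypothetical source file). The CRF number is the dial: higher values mean smaller files and more visible compression, lower values mean more bits and better quality.

```shell
# Heavier compression: smaller file, more artifacts (the applesauce end).
ffmpeg -i input.mp4 -c:v libx265 -crf 32 -preset medium small.mp4

# Lighter compression: bigger file, closer to the source.
ffmpeg -i input.mp4 -c:v libx265 -crf 18 -preset medium pretty.mp4
```

Same codec, same source, wildly different results, which is exactly why the codec name alone tells you nothing about quality.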
While this arrangement offers flexibility, it also means that "4K" and "UHD" won't necessarily guarantee better picture quality any more than "1080p" or "HD" do today. A highly compressed 4K signal could, in many ways, look worse than a less heavily compressed HD signal.
In other words, streaming 4K might look worse than current 1080p Blu-ray, depending on how much compression is used. With Netflix now streaming "House of Cards" in Ultra HD at 15.6 Mbps, early reports describe 1080p Blu-ray as looking "cleaner," confirming some expert predictions. The likely reason? 1080p Blu-ray has a lot more bandwidth to devote to video than online streaming, more than compensating for the discs' older compression scheme.
And while processing speed in all devices follows Moore's Law, Internet bandwidth does not. Sure, there are pockets of true high-speed connections, but many people struggle to get a decent streaming HD signal. Given Net Neutrality's shaky footing in the US, the future of decent, cheap 4K streaming for the masses remains cloudy.
One other benefit
While most of HEVC's potential benefits are focused on 4K, its better compression provides benefits for HD, too. Lower bandwidth with HD means more people can get HD. People out in the sticks with connections too slow for current HD might be able to get HEVC-encoded HD. If you pay per megabyte (mobile or at home), lower bit rates mean cheaper HD viewing as well.
Start looking out for HEVC (or H.265) as a line item on TVs, Blu-ray players and other media players in the future. Nearly all major-brand 2014 4K TVs include the necessary hardware decoder, although 2013 4K TVs do not. There will also be more streamers like the Sony FMP-X10 that include the requisite hardware.
There was a lot of grumbling during the transition to H.264/AVC at the advent of Blu-ray; now it's a given. The same will be true of HEVC, eventually. Lower data rates, while maintaining quality, are a good thing for everyone.
Special thanks to Broadcom's Rich Nelson for his help with some background info for this article (and the chart).
Got a question for Geoff? First, check out all the other articles he's written on topics like why all HDMI cables are the same, LED LCD vs. plasma, active versus passive 3D, and more. Still have a question? Send him an e-mail! He won't tell you what TV to buy, but he might use your letter in a future article. You can also send him a message on Twitter @TechWriterGeoff or Google+.