Almost every video product these days will convert standard-definition signals to HD, but how?
Geoffrey Morrison is a writer/photographer about tech and travel for CNET, The New York Times, and other web and print publications.
Your HDTV, HD cable box, Blu-ray player, and even most DVD players and receivers will "upconvert," or scale, a standard-definition image to fill the screen of an HDTV.
Despite the marketing hype, this doesn't make SD look like HD, but it can make it look better than regular SD. The better the source, the better the result on your TV. How good a piece of gear can make standard-definition content look is one of the major performance differences between products. Recent TVs and Blu-ray players have gotten quite good at this.
The irony is, as good as scalers or upconverters have gotten, they're becoming obsolete.
Let's take a look at the problem. An SD image, and we've got more than 60 years of them, has a resolution of at best 640x480 pixels. Old-school analog SD isn't actually in pixels (the building blocks of a digital image), but for ease of understanding let's use these numbers, as they're roughly comparable. Most HDTVs these days are 1080p, meaning they have a resolution of 1,920x1,080 pixels.
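To put those numbers in perspective, here's a quick back-of-the-envelope calculation in plain Python, using the article's approximate figures. The TV has to stretch the image by different amounts in each direction, and ends up inventing far more pixels than the source provides:

```python
# Rough math for scaling a 640x480 SD frame onto a 1,920x1,080 HDTV panel.
# These are the approximations used in the article, not exact broadcast specs.

sd_w, sd_h = 640, 480
hd_w, hd_h = 1920, 1080

print(hd_w / hd_h)                   # horizontal scale factor isn't needed; see below
print(hd_w / sd_w)                   # 3.0  -- horizontal scale factor
print(hd_h / sd_h)                   # 2.25 -- vertical scale factor
print((hd_w * hd_h) / (sd_w * sd_h)) # 6.75 -- the HD panel has 6.75x as many pixels
```

Note the two factors differ (3x vs. 2.25x) because SD and HD have different shapes, which is the aspect-ratio problem discussed below.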
If you were to display an SD image on an HDTV pixel for pixel, it would look something like this:
For the record, DVD has slightly more horizontal resolution, but it's not consequential in relation to what we're talking about here.
Now, people don't want to watch a postage-stamp-size image on their shiny new HDTV, so the TV scales the image up to full size. The result is a full-size image, more or less.
The problem, as you can see, is twofold. The first is that the nearly square 4:3 image doesn't fill the wider 16:9 HDTV screen. The only way to "fix" this is the aspect-ratio control on your TV, which I don't recommend: you'll get a stretched image, lose information at the top and bottom of the image, or both.
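The geometry behind those black bars is simple enough to sketch. If a 4:3 image is scaled to the panel's full 1,080-pixel height without stretching, the leftover width becomes the "pillarbox" bars on either side:

```python
# Why a 4:3 SD frame leaves black bars on a 16:9 panel: scale it to the
# full 1,080-pixel height and see how much horizontal room is left over.
panel_w, panel_h = 1920, 1080

scaled_w = panel_h * 4 // 3           # 4:3 image at full height is 1,440 wide
bar_each_side = (panel_w - scaled_w) // 2

print(scaled_w)        # 1440
print(bar_each_side)   # 240-pixel black bar on each side
```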
The other problem, and the one we're talking about here, is that the TV has to create new pixels in order for the original image to fit the screen. No matter what, it's going to be softer than a true HD image.
How well a TV creates this new information varies from set to set. All TVs do it, some better than others.
Faced with a limited resolution to start with, the upconverter in the TV has to "guess" to create detail. At its core, detail is the transition from one color to another. The transition from white to black that lets you see the words on this page is a good example of fine detail at its most basic. If we zoom in on something simple like that, we can see the challenge upconverters have.
Here is a black line, in its original form, as seen in its native resolution. I've zoomed in so you can see the pixels that make up the line:
Now, if you were to do a simple upconversion on this image, and show it on an HDTV, it might look something like this:
As you can see, more pixels are used to create the same image, and more of them are used for the transition from black to white. The problem is that those extra "transition pixels," viewed at a distance, make the image appear soft. The upconverter's job is to minimize this softness, perhaps giving something like this:
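Here's a toy one-dimensional illustration of where those transition pixels come from (plain Python; no real scaler works this simply). A hard white-to-black edge is upscaled 3x two ways: nearest-neighbor just repeats pixels and keeps the edge hard, while linear interpolation invents in-between gray values, which is exactly the softness described above:

```python
# Toy sketch: upscale a hard white-to-black edge 3x two different ways.

edge = [255, 255, 0, 0]  # white-to-black edge at native resolution

def nearest(row, factor):
    # Nearest-neighbor: repeat each pixel; edges stay hard but blocky.
    return [v for v in row for _ in range(factor)]

def linear(row, factor):
    # Linear interpolation: blend between neighbors; edges get gray
    # "transition pixels."
    out = []
    for i in range(len(row) * factor):
        pos = i / factor
        left = min(int(pos), len(row) - 1)
        right = min(left + 1, len(row) - 1)
        frac = pos - left
        out.append(round(row[left] * (1 - frac) + row[right] * frac))
    return out

print(nearest(edge, 3))  # [255, 255, 255, 255, 255, 255, 0, 0, 0, 0, 0, 0]
print(linear(edge, 3))   # grays (170, 85) appear around the edge
```

A good upconverter then tries to sharpen those in-between grays back toward pure black and white without introducing other artifacts.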
Better, right? Still not perfect, though. Had this line been created as an HD image originally, it would have perfect black-to-white transitions.
Should you upconvert your upconversion?
Upconverting has gotten better and better as the processors in televisions have gotten more powerful. Most TVs do at least a passable job these days. The reality is, it matters less and less because the upconversion done in most Blu-ray players is excellent, often better than what's done in a television. Playing DVDs, which are all SD, on a Blu-ray player is a best-case scenario for watching SD on an HDTV.
If you're watching the SD channels from your cable provider, your cable/satellite box is upconverting these channels before the content is even sent to your television. The upconverters in cable/satellite boxes, unfortunately, are almost universally terrible. If you can set your box to output the channel in its native resolution, it's worth seeing if your TV does a better job upconverting it. Few cable/satellite boxes let you do this, though. Most just force you to pick one resolution for everything. My recommendation in this case is to leave it set to output 1080i. At least then you'll get 1080i content (CBS, NBC, HBO, Discovery) at 1080i. If you have a 720p TV, you can experiment with setting the output of the cable/satellite box to 720p so you'll get that content at its native resolution (ABC, Fox, ESPN). Likely the difference will be minimal, and SD won't look great either way.
Dealing with 1080i brings in an entirely new issue: deinterlacing. A 1080i signal is interlaced, as in every sixtieth of a second the TV receives a 1,920x540-pixel image (the odd lines of the image), and the following sixtieth of a second it gets a slightly different 1,920x540-pixel image (the even lines of the image). The TV weaves these together to create the full 1,920x1,080-pixel-resolution picture. How well a TV combines these "fields" into "frames" is another performance differentiator. For more information, check out 1080i and 1080p are the same resolution.
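The simplest deinterlacing method, called "weave," can be sketched in a few lines: the two 540-line fields are interleaved back into one 1,080-line frame. This only works cleanly for a static image; real TVs also have to detect motion between the fields, since they arrive a sixtieth of a second apart:

```python
# Minimal sketch of "weave" deinterlacing: interleave two fields back into
# one frame. Assumes a static image; real deinterlacers also compensate
# for motion between the two fields.

def weave(top_field, bottom_field):
    # top_field holds the odd scan lines, bottom_field the even ones
    # (or vice versa, depending on the convention); re-interleave them.
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.extend([top_line, bottom_line])
    return frame

# Toy four-line "frame" split into two two-line fields:
top = ["line0", "line2"]
bottom = ["line1", "line3"]
print(weave(top, bottom))  # ['line0', 'line1', 'line2', 'line3']
```

For a real 1080i signal, each field would be 540 rows of 1,920 pixels, and the woven result 1,080 rows.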
You could also skip cable and satellite altogether and get the source free and direct with HD over the air, but that too is a different article.
There will always be signals of different resolutions, so scaling or upconverting will always be important. That said, again, it's rare that the television's performance in this regard matters much anymore. Most TVs these days get sent HD content, and not native SD (other than a Wii). It is an important factor in Blu-ray players, if you watch DVDs or SD streaming content from Netflix and the like. Most current Blu-ray players do a pretty good job of upconverting, though, and while some are still better than others the difference isn't huge.
If you want to check how good the upconverter/scaler is in your TV or Blu-ray player, you'll need some test patterns. The cheapest way to find some decent test patterns is to get a disc like the Spears & Munsil High-Definition Benchmark. You'll also be able to make sure all your TV settings are set correctly with this same disc. And you'll need to make sure your Blu-ray player is hooked up with either component or HDMI cables. The single yellow cable isn't HD.