Why your 4K TV is probably the only 4K converter you need
If you have a 4K TV, you don't need to use an upconverting Blu-ray player or receiver. Here's why.
Geoffrey Morrison is a writer/photographer about tech and travel for CNET, The New York Times, and other web and print publications. He's also the Editor-at-Large for The Wirecutter. He has written for Sound&Vision magazine, Home Theater magazine, and was the Editor-in-Chief of Home Entertainment magazine. He is NIST and ISF trained, and has a degree in Television/Radio from Ithaca College. His bestselling novel, Undersea, and its sequel, Undersea Atrophia, are available in paperback and digitally on Amazon. He spends most of the year as a digital nomad, living and working while traveling around the world. You can follow his travels at BaldNomad.com and on his YouTube channel.
OK, so you've got your spankin' new 4K TV. Do you also need a spankin' new Blu-ray player and/or an AV receiver that upconverts high-definition sources to 4K?
The truth is, no, you don't. Your TV will convert every signal you send it to 4K resolution.
That means all your current gear will play on a 4K TV, assuming you can connect it, regardless of how low-quality it is. Even your old VHS deck will work. Spoiler: it's going to look terrible.
To look its best, of course, you should feed your 4K TV actual 4K resolution TV shows and movies, which may well require buying new gear. Just don't think you need that gear to handle the conversion process, too.
Here's what you need to know about how your 4K TV handles all those other, non-4K resolutions. The short version? It usually does a pretty good job, and if you see a problem, blame the source before you blame the TV.
Native resolution on your 4K TV
All TVs have a specific number of pixels, or "picture elements." These are the tiny dots (usually squares) that make up the screen. If you look really close, you can see them. To display a picture, the TV has to assign a color and brightness to every one of those pixels.
In the case of 4K TVs there are 3,840 of these pixels across, and 2,160 vertically, for a "native resolution" of 3,840x2,160.
If you send the TV 4K content, this, too, is 3,840 by 2,160. So the TV has very little work to do. For every pixel it has, there's a corresponding pixel in the content. If you choose the correct aspect ratio, the TV displays these 1:1, matching the source resolution and the native resolution without having to do any conversion.
But if you send the TV something other than 4K content, the TV must make the conversion. If it didn't, your Blu-ray (which is 1,920x1,080) would be a small box in the center of the screen, surrounded by black. Most of the pixels would go unused.
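To put numbers on that "small box": a few lines of arithmetic, using the resolutions from this article, show that an unscaled 1080p frame would light up only a quarter of a 4K panel's pixels (a 4K screen is exactly twice as wide and twice as tall as 1080p).

```python
# Pixel math: how much of a 4K panel a 1080p frame would cover
# if the TV displayed it 1:1 with no scaling.
UHD_W, UHD_H = 3840, 2160   # 4K TV native resolution
HD_W, HD_H = 1920, 1080     # Blu-ray resolution

uhd_pixels = UHD_W * UHD_H  # total pixels on the panel
hd_pixels = HD_W * HD_H     # pixels in the incoming frame

print(f"4K panel:    {uhd_pixels:,} pixels")  # 8,294,400
print(f"1080p frame: {hd_pixels:,} pixels")   # 2,073,600
print(f"Unscaled, the frame fills {hd_pixels / uhd_pixels:.0%} of the screen")  # 25%
```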
And most everything you'll watch today isn't in 4K.
How scaling works, 12 cookies at a time
For non-4K sources, like almost all broadcast TV, streaming video and games, the TV guesses (using fancy math) what the additional pixels should be so the image fills the screen. This process is called "upconversion" or "scaling." The TV takes the smaller incoming image and expands it to fill the screen, all on its own.
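The simplest version of that "fancy math" is nearest-neighbor scaling: for each pixel on the big screen, pick the closest pixel in the smaller source image. This toy sketch (my own illustration, not how any actual TV's scaler is implemented; real scalers use much smarter interpolation) shows the basic idea.

```python
# Toy nearest-neighbor upscaler: each output pixel copies the nearest
# source pixel. Real TV scalers interpolate and filter, but the job is
# the same: fill a big grid of pixels from a smaller one.
def upscale_nearest(image, out_w, out_h):
    in_h, in_w = len(image), len(image[0])
    return [
        [image[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# A 2x2 "image" stretched to 4x4: each source pixel becomes a 2x2 block.
small = [[1, 2],
         [3, 4]]
for row in upscale_nearest(small, 4, 4):
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```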
Generally, the process is called "upconverting" and the device that does it is called a "scaler." Then again, you could also say the upconverter scales the image, and I don't think anyone will freak out.
Believe it or not, modern TVs do a pretty good job of scaling, the proof being that most people have no idea it's even happening. The problem is, there's only so much a scaler can do.
If you're expanding good quality HD content, like from a Blu-ray, the resulting image will look pretty good. If you're starting with something of a much lower resolution, like a 720x480 standard-def cable channel, the TV has to create nearly 8 million new pixels. That's 23 times as many pixels as are in the original image. Technology can only do so much, and that's basically creating something out of nothing.
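Here's the arithmetic behind those numbers, using the standard-def and 4K resolutions mentioned above.

```python
# The scaler's task for a standard-def source on a 4K panel.
SD_W, SD_H = 720, 480        # standard-definition frame
UHD_W, UHD_H = 3840, 2160    # 4K panel

sd = SD_W * SD_H             # 345,600 pixels to start with
uhd = UHD_W * UHD_H          # 8,294,400 pixels to fill
invented = uhd - sd          # pixels the TV has to make up

print(f"{invented:,} new pixels")               # 7,948,800
print(f"{invented / sd:.0f}x the original")     # 23x
```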
Or to put it another way, if you have enough cookie dough to make 12 cookies, you can probably stretch it to make 14 and no one will notice. Try to stretch the same amount of dough to make 50 cookies, and people will start questioning your cookie-making skills. I mean, I guess it's still a cookie, but it's going to be a highly unsatisfying cookie experience.
And this is all just looking at the raw numbers. If the original image is noisy or soft, like what you get on the lesser channels from many cable providers, it's going to look even worse. Start with bad dough, and you're going to get bad cookies. Pardon me, I suddenly seem to want cookies for some reason.
Scaling from other devices
You may have noticed that I've only talked about the TV itself, not upconverting Blu-ray players and receivers. This is because your TV's internal scaler is "fine." It will do all this upconverting automatically. Anything you send it, it shows full screen, fully upconverted.
This is not to say that all scalers are equal. The best scalers can create an image with much greater apparent detail and lower noise than the worst scalers. I've seen incredibly well-upconverted HD images that were nearly indistinguishable from real 4K content (and so have you, as a fair amount of 4K content is upconverted before it even gets to your TV).
The thing is, the difference between the best scalers in A/V gear and the one in your TV is pretty small (as long as your TV is decent). If you're watching a channel and it looks terrible, chances are the best scaler on Earth isn't going to make it watchable. To go back to the cookie analogy (because apparently I'm hungry), the best scaler might get you 16 cookies, but it's not getting you 30. Will you see that difference on your TV? It's possible, sure, but it is going to be way less obvious than just watching the Blu-ray over the streaming version (or the DVD).
So is it possible a high-end Blu-ray player or receiver will have a better scaler than the one inside your TV (and potentially create a better image)? Yes, absolutely, but the difference will be slight, and if you're sitting more than 10 feet away, you probably won't be able to see the difference. Then again, it might not. There's no blanket "yes" or "no" here.
Which is to say, you don't need an additional 4K upconverter. However...
You do need a 4K source
So far, everything I've said applies only if you're trying to watch non-4K content (DVDs, Blu-rays, etc.). Your TV will upconvert these just fine. However, if you want your 4K TV to look its best, you absolutely have to have a 4K source.
Most TVs come with internal apps (Netflix, for example) that will stream 4K content. Alternatively, or if you want a bigger selection of content, you need either a media streamer or an Ultra HD Blu-ray player. The best image possible right now is from 4K Blu-ray, which will pretty much always look better than streaming. Some of these use upconverted 2K content, but most of the best-looking have 4K all the way through. Reference Home Theater keeps a current list of the details for each disc, which ones are "true" 4K and which are upconverted by the studios.
But like anything, there is some gray area. Will a well-upconverted Blu-ray look better than 4K streaming? Maybe. Will a 4K Blu-ray that's just an upconverted 2K film look the same as the Blu-ray? Probably not, as the scalers that studios use to process content are way more powerful than what you have in your house. But it's possible it will look the same or similar. It might look a lot better.
In the end, try to send your TV the best signal possible. In descending order of quality, they are: 4K Blu-ray, 4K streaming/HD Blu-ray, HD cable/satellite, DVD, and everything else (though there can be exceptions to that list). Your TV will do an acceptable job upconverting whatever you send it, unless it's a really poor-performing budget model.
If you already have a BD player or receiver that can upconvert to 4K and want to find out which device has the best scaler, go for it. Play with the resolution output on the player/receiver while watching the same content and check whether you notice a difference. If you can't, don't worry about it. It just means your 4K TV is doing a fine job of scaling.