
Should you get a 4K TV for gaming?

Gaming is one of the most oft-referenced reasons to get a 4K TV. But is it worth it? We weigh the pros and cons.

Geoffrey Morrison, Contributor
Geoffrey Morrison is a writer and photographer covering tech and travel for CNET, The New York Times, and other web and print publications. He's also the Editor-at-Large for The Wirecutter. He has written for Sound&Vision magazine and Home Theater magazine, and was the Editor-in-Chief of Home Entertainment magazine. He is NIST and ISF trained, and has a degree in Television/Radio from Ithaca College. His bestselling novel, Undersea, and its sequel, Undersea Atrophia, are available in paperback and digitally on Amazon. He spends most of the year as a digital nomad, living and working while traveling around the world. You can follow his travels at BaldNomad.com and on his YouTube channel.

Today there are plenty of TVs for sale with 4K resolution, the successor to high-definition TV, but almost no actual 4K content. In the absence of pretty much anything else to watch in 4K today, gaming is one of the most commonly mentioned uses for 4K TVs.

"Sure," they say. "There are no 4K movies or TV shows yet, but think of the games!"

Not so fast. Playing a game using four times as many pixels isn't the clear-cut win it may seem at first glance. There are a lot of issues with gaming in 4K as it stands today, and for a lot of people, it's definitely not a good reason to get a 4K TV.

Here are the pros and cons of getting a 4K TV for gaming.

Do you even have 4K games?

If you are a console gamer as opposed to a PC one, then no, you do not. Neither the PlayStation 4 nor the Xbox One is capable of 4K gaming. Even if that support is somehow added, the limitations of the hardware mean games will be rendered at 1080p and upconverted, not true 4K. Upconversion is how the PS2 did 1080i and the PS3 and Xbox 360 did 1080p (with very few exceptions).

So the only way to get games at 4K right now is with a PC.

Neither the Xbox One nor the PS4 allows 4K gaming.

Powerful PC required

Increasing the resolution to 4K puts a huge strain on a PC's graphics card. You may be able to run 4K with your current rig, but you might have to reduce all the image-quality settings to do it. A game you could play with all the graphics options maxed out at 1080p might only be playable at medium or low settings in 4K. As a result, it might actually look better at the lower resolution.

CNET recently tested a couple of extremely high-end PCs running multiple Nvidia Titan X graphics cards, an $8,800 Maingear Shift and a $7,800 Origin Millennium, and both achieved nearly triple the frame rates at 1080p resolution compared with 4K. You don't have to spend nearly that much to game in 4K -- the $2,300 Asus ROG G751 laptop can manage decent 4K frame rates, for example, and you can spend even less by configuring a custom desktop -- but the test does give an idea of how much more power all those extra pixels require.
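The pixel math alone explains most of that result. Here's a quick back-of-the-envelope check (simple arithmetic, not from CNET's test):

```python
# Why 4K is so demanding: raw pixels the graphics card must render per frame.
uhd = 3840 * 2160    # 4K (Ultra HD): 8,294,400 pixels
fhd = 1920 * 1080    # 1080p:         2,073,600 pixels

print(uhd / fhd)     # 4.0 -- exactly four times the pixels per frame
```

Note that GPU performance doesn't scale perfectly linearly with pixel count, which is why the tested machines lost "only" about two-thirds of their frame rate rather than three-quarters.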


It's also worth noting that, even with optimal settings, not every game is going to look that much better in 4K. If a game wasn't made to run at 4K, it might look a little better, but without 4K textures it's not going to be a night-and-day change. When I ran Battlefield 4 in 4K on the first round of 4K TVs, the edges of objects looked crisper (with fewer jaggies), but it was only a slight improvement.

This isn't to say games will never look incredible in 4K. They will, but we're several years from that being commonplace.


Then there's the TV...

Not just any 4K TV will do, either. Most early 4K TVs, the ones you might be able to pick up ultracheap, used the older HDMI 1.4 connection standard, which maxes out at 3,840x2,160 pixels at 30 frames per second. That's not terrible, but it's probably not as smooth as you're used to. Nearly all console games are designed to run at 60fps, for example, even at the expense of resolution and other factors.

Nearly all 4K TVs this year use HDMI 2.0, which ups the maximum 4K frame rate to 60. True, you could try to pick up one of the very, very few 4K TVs with DisplayPort, but then you're buying a TV based solely on its inputs, and that's probably not the best idea. (Shouldn't picture quality come first?)
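Those frame-rate ceilings come straight from link bandwidth. Here's a rough sketch of the arithmetic, using the usable video data rates from the HDMI specs (about 8.16Gbps for HDMI 1.4, 14.4Gbps for HDMI 2.0) and the standard 4,400x2,250 total raster (active pixels plus blanking intervals) that carries 3,840x2,160 video:

```python
# Rough bandwidth arithmetic behind the HDMI 1.4 vs. HDMI 2.0 4K limits.
# Total raster sizes (including blanking) come from the standard video
# timings; the link rates below are the specs' usable video bandwidth.

HDMI_1_4_GBPS = 8.16   # usable video data rate, HDMI 1.4
HDMI_2_0_GBPS = 14.4   # usable video data rate, HDMI 2.0

def video_rate_gbps(h_total, v_total, fps, bits_per_pixel=24):
    """Data rate needed to carry a video format, in gigabits per second."""
    return h_total * v_total * fps * bits_per_pixel / 1e9

# 3,840x2,160 video is carried in a 4,400x2,250 total raster.
uhd30 = video_rate_gbps(4400, 2250, 30)   # ~7.1 Gbps -> fits HDMI 1.4
uhd60 = video_rate_gbps(4400, 2250, 60)   # ~14.3 Gbps -> needs HDMI 2.0

print(f"4K30: {uhd30:.1f} Gbps, 4K60: {uhd60:.1f} Gbps")
```

So 4K at 30fps squeaks under HDMI 1.4's ceiling, while 4K at 60fps only just fits within HDMI 2.0's.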

For that matter, does your current video card have HDMI 2.0? Probably not (few do, at the moment), but getting a new video card is likely to happen a lot sooner than replacing your TV.

Input lag

If you're a medium-to-hardcore gamer, you need to care about input lag. It has a huge effect on your scores and, if you're like me, on your enjoyment of a game.

Input lag is the time a TV takes to generate an image, which adds a delay between when you press a button and when that action appears on screen. When you see something on screen, an enemy appearing around a corner for instance, input lag increases the amount of time before you can react to it.

Personally, switching from a low-input-lag display (the Sony HW40ES projector) to a high-input-lag display (the JVC DLA-X35) meant that in Battlefield 4 and Hardline I went from mid- and upper-mid-pack scores per round (I ain't as good as I once was...) to the bottom 30 percent. Tripling the lag did nothing for my diminishing skills.

Some 4K TVs have pretty good input lag (in the 40ms range), but few are as fast as the fastest 1080p TVs (or many computer monitors). In the end, you may have a more limited selection of brands or models if you want both 4K and low input lag.
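To put those numbers in perspective, here's a quick conversion from milliseconds of lag to whole frames of delay at 60fps (the 40ms figure is from above; the other values are illustrative):

```python
# Convert display input lag into frames of delay at a given frame rate.
# The 40 ms figure matches the "pretty good" 4K TVs mentioned above;
# 20 ms and 100 ms are illustrative fast and slow displays.

def lag_in_frames(lag_ms, fps=60):
    """How many rendered frames pass before your input shows on screen."""
    return lag_ms / 1000 * fps

for lag in (20, 40, 100):
    print(f"{lag} ms of input lag = {lag_in_frames(lag):.1f} frames at 60fps")
```

At 60fps, 40ms of lag means the TV is showing you action roughly two and a half frames old, which is the difference the scoreboard notices.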


Sitting closer

One of the main arguments for 4K gaming is the ability to sit closer, keep a detailed image, and get a massive field of view. If you sit close enough, the screen fills so much of your peripheral vision that only a VR headset like the Oculus Rift could do better.

Gaming from close up is not without its drawbacks. I sit 9 feet from a 102-inch screen. It's awesome, I'm not going to lie. For gaming, however, there's a big issue you might not have thought about: you can't see the whole screen at once. You can see it, sure, but the corners are in your peripheral vision. Most games put important information in the corners, like a health bar, minimap or ammunition counter, that you have to look down, over or up (essentially "away") to see.
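If you're curious how much of your vision a setup like that fills, the geometry is easy to sketch (the helper function here is hypothetical, but the trigonometry is standard):

```python
# Horizontal field of view for a 16:9 screen, given its diagonal size
# and your seating distance. Uses the author's setup (102-inch screen,
# 9 feet away) as the worked example.
import math

def horizontal_fov_deg(diagonal_in, distance_in, aspect=16/9):
    """Angle the screen's width subtends at the viewer's eye, in degrees."""
    width = diagonal_in * aspect / math.sqrt(aspect**2 + 1)
    return math.degrees(2 * math.atan(width / (2 * distance_in)))

fov = horizontal_fov_deg(102, 9 * 12)   # 9 feet = 108 inches
print(f"102-inch screen at 9 feet: ~{fov:.0f}-degree field of view")
```

That works out to roughly a 45-degree horizontal field of view -- wide enough that corner HUD elements sit far outside the few degrees of sharp central vision, which is exactly the look-away problem described above.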

It's like driving and having to take your eyes off the road to see what station the radio is on. Except in this case, it may mean not seeing when an enemy shows up on your radar, and he casually walks up behind you and knifes you in the back. This happens to me a lot.

So the added resolution is potentially great for games with lots of stuff on screen, but if you're sitting close enough to see all the tiny details, you might also be so close that you're missing stuff happening on the edges of the screen.

Not a deal-breaker, for sure. Hell, I still do it, because I really like the immersive feel of a huge image. But it's something to keep in mind.

Bottom line

I hate writing this type of article, because I know a lot of people think gaming in 4K is going to be some revelation. It's an improvement, as long as you have a powerful PC, but not as huge a one as you'd expect -- not yet, anyway. As more games are designed with 4K textures, it will become far more impressive, as long as you sit close enough to discern the extra resolution.


Got a question for Geoff? First, check out all the other articles he's written on topics such as why all HDMI cables are the same, LED LCD vs. OLED vs. Plasma, why 4K TVs aren't worth it and more. Still have a question? Send him an email! He won't tell you what TV to buy, but he might use your letter in a future article. You can also send him a message on Twitter @TechWriterGeoff or Google+.