A TV's Game mode might improve your gaming experience, but it will definitely decrease your picture quality.
Geoffrey Morrison is a writer/photographer about tech and travel for CNET, The New York Times, and other web and print publications. He's also the Editor-at-Large for The Wirecutter. He has written for Sound&Vision magazine, Home Theater magazine, and was the Editor-in-Chief of Home Entertainment magazine. He is NIST and ISF trained, and has a degree in Television/Radio from Ithaca College. His bestselling novel, Undersea, and its sequel, Undersea Atrophia, are available in paperback and digitally on Amazon. He spends most of the year as a digital nomad, living and working while traveling around the world. You can follow his travels at BaldNomad.com and on his YouTube channel.
If you've bought a TV in the last few years, it's likely that TV has a "Game mode." As you've discovered (or will discover, now that you're looking for it), this is not some supersecret hiding place for Angry Birds. That's a different section.
If you're a gamer, Game mode might improve your gaming experience, but it comes at a cost.
Input lag is not your friend
The problem is lag. Lag, in the colloquial gaming sense, is the difference in time between your brain deciding on an action and that action happening on your TV screen. It goes something like this: your brain registers an enemy on the screen and sends electrical impulses to your fingers. Your fingers press a button. The controller turns this button press back into electrical signals and sends them to the PC/console. The PC/console then does whatever action you instructed, let's say pulling an onscreen trigger. That gets sent to the TV, and you see the result.
If you're playing the game online, there's additional lag between your PC/console and the central gaming server (and then the other player's PC/console). For this discussion, let's just talk about input lag, not network lag, which is an entirely different phenomenon that unfortunately shares the same name (not to mention all the noob PC gamers that call poor frame rates "lag").
Since timing is everything in a fast-paced video game, you want as little lag as possible between your brain and the action on the screen. The time it takes your brain to think an action, and that action to get to the PC/console, is infinitesimal compared with all the next steps. What the game does with the information can vary, but let's say, in a perfect world, that it, too, is "instant." Even if it's not, there's nothing you can do about it.
So that leaves the TV. Modern televisions are processing powerhouses; there's a lot going on inside a modern TV for it to do what it needs to do. It has to receive and decode the incoming video signal (likely from HDMI, but possibly analog), then process that signal into something useful: deinterlacing, scaling and so on. Then it goes even further, converting all of that into whatever "language" the display uses to make its image. Plasmas and LCDs create images in very different ways, and neither is as simple as a scanning electron beam (not that that was particularly simple, either).
The problem is, all those processing steps take time. Time you don't have. Milliseconds count in a twitch-based game, and a few milliseconds for deinterlacing or scaling, a few for color processing, a few for the TV to figure out what to do with all these bits...and NOPE, too late, you have been eaten by a grue.
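To see how quickly those milliseconds add up, here's a back-of-the-envelope calculation in Python. The stage names and per-stage delays below are invented for the example, not measurements from any particular TV:

```python
# Hypothetical per-stage processing delays inside a TV, in milliseconds.
# These figures are illustrative only; real values vary by model and mode.
stages = {
    "signal decode": 5,
    "deinterlacing/scaling": 10,
    "color processing": 8,
    "motion interpolation": 25,
    "panel driving": 5,
}

total_lag_ms = sum(stages.values())
frame_time_ms = 1000 / 60  # one frame at 60fps lasts about 16.7ms

print(f"Total input lag: {total_lag_ms}ms")
print(f"That's about {total_lag_ms / frame_time_ms:.1f} frames behind at 60fps")
```

With these made-up numbers, the TV alone puts you more than three 60fps frames behind the action, before network lag even enters the picture.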
Another, more serious, consequence of slow processing and "lag" is lip-sync errors. You've all seen this, where you hear the voices but the lips don't match exactly, sort of like a bad English dub of a foreign-language movie. This is far more distracting, and once you notice it, you can't not notice it.
As processing power (or, more precisely, cheap processing power) has increased, input lag has generally decreased. There are exceptions, though, where a TV manufacturer has cut costs and gone with cheaper processors, which take longer to sort through the mounds of data that make up an HD television signal.
Enter the Game (mode)
Knowing that gamers would be bothered by bad input lag, companies years ago started adding a Game mode to reduce input lag and tie your in-game performance more closely to your personal skills and reflexes (for better or worse).
However, Game mode isn't magic, nor does it simply overclock the processors and jack up the performance. Instead, it starts taking things out. Color processing, noise reduction, advanced scaling: all of these aspects and more get thrown out or greatly reduced. As such, the image gets a lot worse. Scaling artifacts are much more likely. Color accuracy, and potentially even color temperature tracking, suffer. The image can get noisier as well. What specifically gets tossed varies by company, but the end result is the same: less input lag, worse image.
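As a rough illustration of the tradeoff (the stage names, delays and the list of skipped stages are all invented for the example, not taken from any real TV), skipping processing steps is exactly why the lag drops:

```python
# Illustrative processing stages and delays in milliseconds; made up for the example.
stages = {
    "signal decode": 5,
    "deinterlacing/scaling": 10,
    "color processing": 8,
    "motion interpolation": 25,
    "panel driving": 5,
}

# Stages a hypothetical Game mode skips entirely.
skipped_in_game_mode = {"color processing", "motion interpolation"}

normal_lag = sum(stages.values())
game_mode_lag = sum(ms for name, ms in stages.items()
                    if name not in skipped_in_game_mode)

print(f"Normal mode: {normal_lag}ms of input lag")
print(f"Game mode:   {game_mode_lag}ms of input lag")
```

The lag goes down only because the processing, and the picture quality it buys you, goes away with it.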
Some of the most common offenders when it comes to bad input lag are high-refresh-rate LCDs (120Hz and 240Hz). The motion interpolation needed to create frames for these higher refresh rates is processing-intensive, and it's one of the first things to go in Game mode. Ditching the interpolation, however, means motion blur increases considerably: whenever anything moves, or your avatar looks around, the image blurs. This can be almost as bad as the input lag, in that it prevents you from accurately seeing your enemies.
Lastly, Game mode is not a guarantee of no input lag, just a hope for less. One TV I reviewed this year had seriously bad input lag, and while Game mode improved it, it was still significantly worse than other TVs.
Got a question for Geoff? Send him an e-mail! If it's witty, amusing, and/or a good question, you may just see it in a post just like this one. No, he won't tell you which TV to buy. Yes, he'll probably truncate and/or clean up your e-mail. You can also send him a message on Twitter: @TechWriterGeoff.