From 1080p/24 to 4K/120Hz, here's what the numbers really mean.
From the best TVs to game consoles like the Sony PlayStation 5 and Xbox Series X, to cameras like the GoPro Hero 10 and the one on the shiny iPhone 13 Pro, fps matters. But what does fps mean, and why should you care? The short version is that fps stands for "frames per second." In other words, fps is the number of individual still images that make up each second of moving video.
Movies, almost exclusively, are 24fps. Live-action TV shows like sports and concerts are typically 30 or 60fps. Video games vary, but the latest consoles are capable of up to 120fps (as long as you pair them with a TV that can handle it).
But can you even tell the difference? Does fps even matter? Yes, though not always the way you expect. Here's why.
Let's back up slightly and cover some important basics. All video is made up of a series of still images, shown in rapid succession. Show them rapidly enough and your brain is fooled into thinking these still images are actually smooth motion. The threshold where a series of still images starts to read as motion varies depending on a variety of factors. Generally, though, somewhere in the mid-teens of frames per second will appear as "motion" to most observers; fewer frames per second just look like a series of individual images.
Higher frames per second, also known as frame rates, make the image appear smoother and more realistic. Subjectively, there's a huge jump between 15fps and 30fps. There's less of a noticeable jump between 30 and 60, and even less between 60 and 120. But again, depending on the content, display and other factors, you might notice a difference.
With games, higher frame rates can result in smoother, more natural-looking images. Compare that with low frame rates, where the onscreen action will stutter and pause. Because the graphics processing on all consoles is limited, higher frame rates come at a cost. That cost might be fewer polygons, less detail in objects on screen, more basic lighting, less complex textures and so on. With some games, the console might even reduce the game's resolution and then upconvert it at the output to hit the maximum frame rate. The image won't be as detailed, but the motion will be smoother.
A game designer might decide that their game looks and plays better at 60 or 120fps, despite these limitations, but it depends. Not every game supports higher frame rates, though many new games do (and will). Also, both Sony, on the PlayStation 5, and Microsoft, with its FPS Boost feature, have older titles that now run at higher frame rates.
On the PC side, there are gaming monitors capable of 144Hz (more on Hz vs fps in the next section). There are, potentially, some benefits for competitive multiplayer with higher frame rates, since less time is spent "waiting" for the computer to update an image. That advantage is, at best, very slight. We're talking milliseconds.
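To put those milliseconds in perspective: each frame stays on screen for the inverse of the frame rate, so the time saved per frame is easy to work out. A quick sketch (the rates here are just the examples mentioned in the text):

```python
def frame_time_ms(rate_hz: float) -> float:
    """Milliseconds each frame is displayed at a given frame or refresh rate."""
    return 1000.0 / rate_hz

# Higher rates mean each new image arrives sooner.
for rate in (30, 60, 120, 144):
    print(f"{rate:>3}Hz -> {frame_time_ms(rate):.2f} ms per frame")
```

Going from 60Hz (about 16.7ms per frame) to 144Hz (about 6.9ms) saves roughly 10 milliseconds per frame, which is the scale of the competitive advantage in question.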
Consoles and PCs might not hit their maximum theoretical frame rate at all times. If there's a lot going on, a complex boss fight for instance, the hardware might only produce a fraction of that max frame rate. To keep things looking smooth despite those dips, many modern TVs and monitors can adjust their refresh on the fly to match what the console or PC is outputting, a feature known as variable refresh rate, or VRR.
With cameras, there's an additional benefit. Recording video at, say, 120fps lets you play that video back at 60 or 30, greatly slowing down the action. Someone doing a backflip recorded at 120fps and played back at 30 results in some extremely smooth, epic slo-mo.
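The slow-motion factor is simply the capture rate divided by the playback rate, which is worth sketching out:

```python
def slowmo_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower the action appears when footage is
    recorded at capture_fps and played back at playback_fps."""
    return capture_fps / playback_fps

# A 2-second backflip shot at 120fps captures 240 frames.
# Played back at 30fps, those frames stretch across 8 seconds: 4x slo-mo.
print(slowmo_factor(120, 30))  # -> 4.0
print(slowmo_factor(120, 60))  # -> 2.0
```

Because every displayed frame was genuinely captured, the result looks smooth, unlike software slow motion that has to interpolate frames that were never recorded.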
The terms fps and Hz are often used to describe the same thing. Hz, or hertz as you may remember it from high school science class, means "one cycle per second." Generally, TVs and related gear use hertz to describe frame rate, a continuation from the analog CRT days. Content like movies and games use frames per second, a continuation from the old film days.
Technically, these aren't always exactly the same. For somewhat esoteric reasons, "30fps" content in former NTSC countries actually runs at 29.97fps, a holdover from when color was added to analog TV broadcasts.
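That 29.97 figure isn't arbitrary. NTSC rates are the nominal rate multiplied by 1000/1001, a small sketch makes the relationship clear:

```python
def ntsc_rate(nominal_fps: float) -> float:
    """Actual NTSC rate for a nominal frame rate (nominal * 1000/1001)."""
    return nominal_fps * 1000 / 1001

print(round(ntsc_rate(30), 2))  # -> 29.97
print(round(ntsc_rate(60), 2))  # -> 59.94
print(round(ntsc_rate(24), 3))  # -> 23.976 (film transferred to NTSC video)
```

The same 1000/1001 offset is why you'll see 59.94Hz and 23.976fps listed in camera and TV spec sheets alongside the round numbers.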
Effectively, though, consider them the same thing. A TV that's capable of displaying 120Hz content means it wants 120fps content to look its smoothest.
Or to put it simply, fps is the content, Hz is the device. You might see some companies using one term in place of the other, which is fine. In practice, you can use these terms interchangeably.
If you're in the US, Canada or anywhere in the Americas north of Brazil, TVs are 60Hz or a multiple of 60Hz. In the UK, Europe, and most of Asia, Africa and Oceania, TVs are generally 50Hz or a multiple of 50Hz. Go into an electronics shop in many parts of the world and you might see TVs advertised with 100Hz, something that might catch the eye of a tech-savvy American tourist.
This is entirely due to the frequency of the mains power in each country. It's always one or the other. OK, almost always. Some newer TVs might ignore this by offering you 60Hz in a country with 50Hz power.
Does this matter? Not really. With older TVs, flicker was more likely to be visible on 50Hz sets. I certainly noticed it during visits to the UK in the '90s. In the modern age, where nearly every TV is either LCD or OLED, this is only a concern for people especially susceptible to flicker (though the same is true with 60Hz too, for what it's worth).
The race to increase fps doesn't have much of a downside... for the most part. Higher frame rate cameras can record smoother motion, with less motion blur. TVs with higher frame rates themselves produce less motion blur, though potentially with soap opera effect issues. Games with higher frame rates are smoother and can be more realistic, though some other aspect of graphical detail must be reduced on the limited hardware of game consoles.
However, the same isn't true with live-action fictional content, like most TV shows and movies. With decades of conditioning, most people associate the "look" of 24fps content with fiction. Increasing that, something several Hollywood directors have tried, is rife with peril. Yes, a minority of people like it. The majority of people, though, hate it. It completely negates the suspension of disbelief: They're no longer characters in a world, but actors on a set. There's no coming back from that. Thankfully, these experiments in high frame rate, or HFR, have continually been met with derision. Enough so that Hollywood as an industry seems unlikely to adopt something that would so alienate audiences (aka their customers).
And if you do get a TV with 120Hz capability, almost all have the ability to turn off motion smoothing, aka the soap opera effect, so it will look like your old TV. If there's an option for black frame insertion, check that out as an alternative too.