Update, Nov. 6: Read our reviews of the PlayStation 5 and Xbox Series X.
Both the PlayStation 5 and the Xbox Series X have impressive specs. One of the biggest standouts is the possibility of 8K resolution, something available in only a handful of TVs. This jump in pixel count likely comes as a surprise to many, since 8K TV shows and movies are basically nonexistent. Heck, even 4K games are relatively rare, and other 4K video is hard to find beyond movies and original shows on streaming services and Blu-ray discs.
While these consoles are technically capable of outputting an 8K signal, that's not the entire story. In reality, the vast majority of games on the new consoles will be 4K at best, and those that claim 8K resolution will rarely be actual 8K. If this sounds confusing, it is.
Don't get me wrong, the PS5 and Series X deliver a huge leap in graphics quality over the current Xbox One X and PS4 Pro, let alone the earlier, lesser versions of those consoles. It's just that 8K resolution will be less important, especially in the beginning, than other new capabilities like higher frame rates and other eye candy like ray tracing.
Here's what you need to know about all those new features.
8K is wildly optimistic
Every new console in the 21st century has had an attention-grabbing, tech-pushing headline feature. With the PS2 and Xbox, it was 1080i. A few years later, the PS3 and Xbox 360 claimed 1080p. After that, the PS4 and Xbox One solidified the 1080p resolution, adding Ultra HD 4K gaming with the PS4 Pro and Xbox One X refreshes. (The One S can output 4K video, but not games.)
The reality is this was often "optimistic" given the consoles' hardware, then largely overblown by the marketing departments. For example, only a handful of games took advantage of the PS2's 1080i or the PS3's 1080p. Only a small percentage of all PS4 and Xbox One games currently take full advantage of the Pro's and X's 4K abilities.
The truth is, as impressive as the newest consoles seem compared to their predecessors, they're still limited by size and price. As such, they can only just barely do what they claim. Even high-end PC video cards, which cost more by themselves than the new consoles will in their entirety, struggle to create 8K video games. This has been the case with every generation of console.
I'm not trying to ignite some banal PC-versus-console argument. What I'm saying is these magical Sony and Microsoft boxes aren't magic, but scaled-back PCs. As such, they have to come up with shortcuts to do what they claim. Which brings us to rendering versus scaling.
Scaling to 8K is a lot easier than rendering in true 8K
There are lots of ways to supply your TV with the millions of pixels it needs to create a picture. If you have a 4K TV, for example, you ideally want to send it true 4K resolution of 3,840x2,160 pixels. A true 4K version of a movie on Netflix or Vudu, or even better, an Ultra HD Blu-ray disc, is one way to do this.
Another way is to send your TV a lower-resolution video, like a traditional HD TV show, Blu-ray or DVD. If you do this, your TV upconverts, or upscales, the video so it has enough pixels to fill your screen. If your TV didn't do this, you'd have a tiny image in the center of the screen surrounded by black.
Scaling is essentially what each console generation does to achieve its highest-resolution outputs. The game is rendered, i.e. created or "built," at a lower resolution, then upconverted before it's sent to your TV. For example, a game might be rendered at 1080p, like a Blu-ray, but then upscaled by the PS4 or Xbox and sent to your TV as 4K. In many cases there'd be no real difference between a "4K" game like this played on a 4K TV and the 1080p version of the same game played on the same TV. Sony, for instance, calls this "Native vs Adaptive" on the PS4 Pro.
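To make the idea concrete, here's a minimal sketch of nearest-neighbor upscaling, the simplest possible form of the upconversion a console or TV performs. Real scalers use far smarter filtering, but the principle is the same, and it's an illustration only, not how either console actually does it:

```python
def upscale(frame, factor):
    """Repeat each pixel `factor` times horizontally and vertically.

    No new detail is created: existing pixels are simply duplicated
    to fill the larger grid.
    """
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in frame
        for _ in range(factor)
    ]

# A tiny 2x2 "frame" stands in for a rendered image. Doubling each
# dimension mirrors the 1080p -> 4K relationship (1920x1080 -> 3840x2160):
# four times the pixels, zero new information.
frame_low = [[1, 2],
             [3, 4]]
frame_high = upscale(frame_low, 2)
print(frame_high)  # [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

The takeaway: an upscaled "4K" or "8K" signal contains no more detail than the resolution it was rendered at.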
There are always a handful of games in each console generation that are specifically written for, and rendered at, the highest resolution possible by that console. High-resolution rendering comes at a cost, however. There is only so much performance available, and if a game designer is chasing pixels, the game will have to sacrifice some other aspect of graphics quality -- like texture, polygon count, draw distance or frame rate.
Frame rate is the most concrete example of this sacrifice. A game can run at higher resolutions if there are fewer frames created per second. 4K at 60 frames per second requires a lot more processing power than 4K at 30 fps, which is harder than 1080p/60 and so on. Throughout the history of consoles, frame rate has been considered untouchable. Lower image quality was fine, but a choppy, slow frame rate was not.
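Simple arithmetic makes that tradeoff concrete. This back-of-the-envelope sketch counts raw pixels per second only; real rendering cost doesn't scale perfectly linearly, but the ratios show why chasing resolution and frame rate at once is so expensive:

```python
def pixels_per_second(width, height, fps):
    """Raw pixels a renderer must produce each second."""
    return width * height * fps

fhd_60 = pixels_per_second(1920, 1080, 60)     # 1080p at 60fps
uhd_30 = pixels_per_second(3840, 2160, 30)     # 4K at 30fps
uhd_60 = pixels_per_second(3840, 2160, 60)     # 4K at 60fps
eightk_60 = pixels_per_second(7680, 4320, 60)  # 8K at 60fps

print(uhd_60 // fhd_60)     # 4 -> 4K/60 is four times the work of 1080p/60
print(uhd_60 // uhd_30)     # 2 -> 60fps doubles the work of 30fps
print(eightk_60 // uhd_60)  # 4 -> 8K/60 quadruples it yet again
```

So a true 8K/60 game would need roughly 16 times the pixel throughput of a 1080p/60 one, which is why something else has to give.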
Faced with limited hardware performance, budget, time, personpower and countless other considerations, game developers might optimize their game for a specific frame rate, usually 30, sometimes 60, so it "runs well" on a console, but settle for a lower resolution than what's theoretically possible.
That's why very few games on the PS5 and Series X will render at 8K, even for 8K TVs. The majority will likely run at 4K/60 and upscale to 8K for the small percentage of people who will have 8K TVs now or during the life of the console.
What's better than 8K? Higher and variable frame rates
One of the most impressive aspects of the next generation of consoles is not their ability to create higher resolutions. It's their ability to run games at non-8K resolutions with higher frame rates.
Higher frame rates can create ultrasmooth, lifelike motion. With film-based fictional content that smoothness creates a soap opera effect that many people, including me, dislike. With nonfiction and reality-based programming, like sports, smoother motion looks great. Most video games also strive for graphical realism, so the smoother they look, the better.
Current consoles are locked at 60 frames per second (60Hz) since that's the best most TVs can accept. Many TVs can use motion interpolation to smooth out any content, including games, but that's basically the frame-rate version of the resolution upscaling we talked about in the last section. And remember, just because your TV is 120Hz doesn't mean it can accept 120Hz from a source.
Both new consoles will have the ability to create 4K games at 120 frames per second. Again, games will have to be written to take advantage of this higher capability, so don't expect every game to be 4K/120, though some launch titles promise it. Also, most TVs won't be able to accept a 4K/120 input, even if their panels are capable of that refresh rate. Generally this will mean one that has HDMI 2.1, though not all TVs that have HDMI 2.1 will necessarily be able to handle 120Hz.
Both consoles have another new feature called variable refresh rate, or VRR. The idea is that the game and the console can tell the TV to change its refresh rate on the fly -- from its native 50, 60, 100 or 120 -- to whatever's required at the moment.
For example, if a PlayStation 5 game has a lot going on and the frame rate starts to slow down, during a massive boss battle for instance, the TV will slow down its rate as well, waiting to refresh the screen until it receives the frame from the console.
Until now the TV's refresh was locked. The console had to send something no matter what, every 60th of a second or whatever its refresh was. This could result in image tearing, which is exactly what it sounds like: the image looks as if someone has torn it across. It's one of the more noticeable image artifacts.
Not only does VRR minimize or eliminate the possibility of that artifact, it also allows game designers a bit more of a buffer. So if their entire game runs smoothly at 60fps except for a handful of intense situations, they can leave those moments as-is without risking the image looking like Wolverine clawed it. There's a limit to how slow the TV can go, but in general this will be a big improvement. If you're a PC gamer, you'll have heard of this tech before as Nvidia's G-Sync and AMD's FreeSync.
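A toy model shows why a fixed refresh causes trouble. The frame times below are made up for illustration: a fixed 60Hz display refreshes every ~16.7ms no matter what, so any frame that takes longer misses its slot and you get tearing or judder. A VRR display simply waits for each frame instead:

```python
# Fixed 60Hz display: one refresh every 1000/60 ~= 16.7 milliseconds.
REFRESH_MS = 1000 / 60

# Hypothetical per-frame render times: smooth gameplay, then a heavy
# boss battle pushes two frames past the refresh window.
frame_times_ms = [15, 16, 15, 22, 25, 16]

# With a fixed refresh, every frame slower than the window misses it,
# producing a torn or repeated image on screen.
fixed_misses = sum(1 for t in frame_times_ms if t > REFRESH_MS)
print(f"Fixed 60Hz: {fixed_misses} of {len(frame_times_ms)} frames "
      "miss the refresh -> tearing or judder")

# With VRR the display refreshes whenever the frame is ready: zero
# misses; the picture just briefly dips below 60fps instead of breaking up.
print("VRR: 0 missed refreshes; the display waits for each frame")
```

It's a deliberately crude model, but it captures the core benefit: VRR turns a visible artifact into a brief, barely noticeable slowdown.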
Again, you'll need a TV that takes advantage of VRR, so we put together a list of the TVs that support it.
Other eye candy
We talk about HDR a lot at CNET. For games, it has only recently expanded beyond the HDR rendering Valve pioneered over 15 years ago. True HDR, with its greater dynamic range for displays that can support it, came along for the 4K ride in the PS4 Pro and Xbox One X. A growing number of games are taking advantage of this, letting them look even better on HDR TVs. This is likely to expand in future games, as the capability is in both next-gen consoles from launch. With more and more TVs producing better and better HDR, games that can take advantage of that are more than welcome.
Then there's ray-tracing, something that's still causing many PC video cards to wheeze. Nvidia has a good explanation here. The short version is the potential for significantly more lifelike lighting effects in games. This video is a bit into the weeds, but it does a great job showing what's going on.
Both consoles will have this capability as well, for games written to take advantage of it. Since both consoles are using modified versions of the same graphics chip, it will be interesting to see how they look playing the same games. Probably not a huge difference, but likely some.
Will you need a new TV?
So what will you need to play the latest games on your Xbox Series X or PlayStation 5? Well… nothing. Your current TV will work just fine. If you can connect a PS4 or Xbox One to it, it will work fine with the next-gen consoles. You won't need a new TV or new HDMI cables. If your current TV has HDR, it will play HDR games on the new consoles too. Ray-tracing is something rendered in the game, so that will look good on any TV.
If you want to take advantage of new features like 8K, 120Hz and VRR, you'll need a TV that supports them. 8K TVs are expensive now and will likely remain so in 2021 and beyond. Far more affordable are 4K TVs that support 120Hz and VRR. Many current midrange and higher-end 2019 and 2020 TVs, including models from LG, Samsung, Vizio and Sony, can handle these new extras. We have a list of all the 2020 models that have them.
Both the PlayStation 5 and Xbox Series X look better than their predecessors. That's obvious. How developers take advantage of the jump in processing power will determine exactly how they look better. If a developer feels their game looks best at 8K resolution, that's fine, but another developer might feel their game looks best at 4K/30 with more elaborate textures, better lighting, more polygons and so on. Both games might look great. It's easy to fall into the trap of tech specs and miss the artistry that goes into creating any game.
Another factor is future-proofing. 8K TVs are expensive today and developers might not want to take advantage of 8K resolution in the first generation of games, but consoles last a long time and TV prices fall fast. Two or three or five years from now that higher resolution might actually be something that matters more, particularly on a gigantic -- think 85-inch -- TV. And by then you might be able to afford one since prices always drop.
The good news is that these games are going to look great on whatever TV you have now, and if you get a new TV, they might look even better still.
As well as covering TV and other display tech, Geoff does photo tours of cool museums and locations around the world, including nuclear submarines, massive aircraft carriers, medieval castles, airplane graveyards and more.