
How to buy a gaming monitor

Here are the basics you need to start your search.

Lori Grunin, Senior Editor / Advice

Do I need 24, 27, 34 inches or more? Full HD, 1440p or 4K? Curved or flat? Does HDMI 2.1 matter? HDR? If you've just parachuted into the gaming monitor battlefield without a map, these are just some of the questions you might have. This guide should help you get oriented.

Gaming monitor shopping is far more complicated than for other types of displays, at least if you're like most of us and subject to budget constraints. That's because you have to factor in the type of games you play and the capabilities of your GPU when calculating the tradeoffs to save money.

The TL;DR

If you don't want to spend a lot of time thinking about it, my quick-and-dirty recommendation is a 27-inch flat-screen IPS display with 1440p (quad HD) resolution, a refresh rate of 144Hz or better and DisplayHDR 600 (or the equivalent). You can usually find quite a few choices in the $250 to $500 price range. If you need to go cheaper or smaller, drop to a 24-inch 1080p model (aka full HD) with a 144Hz or faster refresh rate; you can find those for $150 to $250. If you want a really good monitor -- 32 inches or bigger with 4K-plus resolution at refresh rates starting at 120Hz and HDR with 1,000 nits or more brightness -- generally expect to spend upward of $1,000. The same frequently goes for cutting-edge technologies, such as QD-OLED (though we don't yet know how much Alienware's 34-inch model will cost).

To me, 24 inches feels small, especially if the monitor is serving time as a work display during the day or if you play games with expansive worlds. But either should be able to handle most types of games. If you want to connect to both a console and a PC, almost any recent monitor will work, but some are optimized for the task in big sizes -- currently 42 inches or bigger -- with an explicit list of the HDMI 2.1 features you care about, such as dynamic HDR metadata (if you want HDR) and variable refresh rate. They'll cost well over $1,000, too.

To save money, at least in the short run, don't overbuy. If you've got a 3-year-old system with a GPU that gets you 90 frames per second in 1440p on your most-played games and you don't plan to upgrade in any meaningful way in the near term, you can save money by not going for the 240Hz model.

Read more: The Best Monitors According to the CNET Staff Who Use Them

Quick recommendations

  • Within the constraints of your budget and desk space, get the largest monitor you can for single-monitor setups. You'll rarely regret buying a monitor that's too big, but you'll frequently regret buying one that's too small. There are also super-widescreen monitors with 21:9 aspect ratio (sometimes listed as 2.35:1), many of which are 34-inch displays with lower-than-4K resolution. 
  • Factor in the aspect ratios your favorite games support. If they only offer 16:9 options, configuring them for a widescreen 21:9, 24:10 or 32:9 monitor can be annoying and frustrating; you may also be able to save some money.
  • Recommended minimum resolutions for gaming: 24 inches, 1,920x1,080 pixels; 27 to 32 inches, 2,560x1,440 pixels; 34 to 39 inches, 3,440x1,440 pixels; 43 to 49 inches, 3,840x2,160 pixels. Note that almost all 49-inch monitors have 3,840x1,080 resolution, which is lower than my recommended threshold. For the rationale behind these suggestions, read the section where I discuss pixel density.
  • Make sure the stand can adjust to the appropriate height for you to use comfortably and tilt to a usable angle. Depending on your needs, you may also want a stand that can swivel or allow the screen to rotate 90 degrees for use in portrait orientation. This may be useful if you want to combine multiple monitors to get a vertical height greater than what you'd be able to get in a single monitor. 
  • Go with one that you find attractive because you'll be staring at it a lot. You may also want to think about models that support lighting coordination across devices, such as Razer's Chroma or Asus' Aura Sync. You also want a stand that looks good and that has sensible cable management, allowing you to feed wires through a hole or channel to keep them together. Cable management can be important if the monitor has a USB hub, since you'll want to keep those cables under control as well.
  • To use with an Xbox Series X or PlayStation 5 console (when that firmware update finally appears), you'll want a monitor with an HDMI 2.1 port that explicitly states it supports 4K at 120Hz with variable refresh (also called VRR) support. 
  • If you're putting together a multimonitor setup, look for thin bezels. A matched set of curved displays may also work better than flat screens. On the other hand, if you play different types of games, such as shooters and sims, you may want to get a pair of monitors optimized for each type of game, such as a fast-refresh QHD 27-inch display for battling and a medium-fast-refresh 32-inch 4K model with a large color gamut for building. I also recommend two 27-inch displays over a single 49-inch.
  • You can save money by sticking to an appropriate refresh rate. In other words, if your GPU rarely hits more than 90fps during gameplay, you probably don't need to drop the extra dough for 165Hz if you can't afford it. If you do have the cash, however, a higher refresh rate may be worth it if there's a GPU upgrade in your future.
  • A gray-to-gray pixel response time of 5ms or less is good for gaming; 1ms is best for fast action on high-refresh-rate screens.
  • For a lot of games, HDR doesn't matter, because they don't have lots of areas with high brightness or deep shadows, or don't take advantage of the bigger tonal range in any meaningful way. But you'll probably get better visuals for AAA games, more creeps from horror games, fewer ambushes out of the shadows in FPS games and so on if they do support HDR. If you do opt for it, anything less than DisplayHDR 600 isn't very noticeable, and that level gives a good balance between price and visible boost. For more detail, check out the story How to choose an HDR gaming monitor.
  • The bigger the color gamut the screen covers the better. At a minimum, you want 100% sRGB, but 90% or higher P3 (also known as DCI-P3) is best, as it delivers more colors. Look for specific gamut coverage percentages rather than terms like "1 billion colors," which are essentially meaningless.
  • Contrast matters -- higher is better -- but the contrast spec provided by manufacturers is almost worthless. Anything above 1,000:1 should be OK. I prefer 1,400:1 or better, however, provided it's a measured result. Contrast is the ratio between the brightest white and darkest black on a given screen, and darker blacks produce better perceived contrast, so the same contrast ratio can sometimes look a little better on a dim monitor than a bright one. There are other factors that affect it, too; see the quick sketch after this list for how the basic ratio works.
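If you're curious how that ratio is calculated, here's a minimal sketch; the luminance figures are made up for illustration, not measurements of any real monitor.

```python
# Contrast ratio is just full-white luminance divided by full-black luminance.
# The nit values below are illustrative, not measurements of real monitors.

def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Return the contrast ratio as a single number (1000 means 1,000:1)."""
    return white_nits / black_nits

# A brighter panel with grayish blacks...
print(contrast_ratio(350, 0.35))  # 1000.0 -> "1,000:1"

# ...can lose on contrast to a dimmer panel with deeper blacks.
print(contrast_ratio(300, 0.20))  # 1500.0 -> "1,500:1"
```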

Want more background? Here you go.

What screen size should I get?

All else being equal, if you've got the space and budget, bigger is almost always better. Screen size labeling is based on the length of the diagonal: That made it easy to compare monitor sizes when almost every screen had the same aspect ratio -- essentially the proportions of the screen rectangle, the ratio of horizontal to vertical pixels. But wide and ultrawide screens on the desktop and newer ratios on laptops (such as 3:2 or 16:10) make cross-size comparisons a little more difficult.

Read more: Best gaming laptops

If you remember your geometry and algebra, you can calculate the width and height of the display if you also know the aspect ratio (because width/height = aspect ratio and width² + height² = diagonal²). The further from 1:1 the aspect ratio is, the wider the screen and the more of it will sit out to the sides -- and therefore in your peripheral vision if you're sitting close. Doing the math also lets you figure out the physical dimensions of the screen, most notably the width, to ensure it will fit in the allotted space.
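If you'd rather not dust off the algebra, here's a minimal sketch of that calculation; the sizes plugged in at the bottom are just examples.

```python
import math

def panel_dimensions(diagonal_in: float, aspect_w: float, aspect_h: float):
    """Return (width, height) in inches for a flat panel, given its diagonal
    and aspect ratio, using width^2 + height^2 = diagonal^2."""
    # Scale factor k so that width = k * aspect_w and height = k * aspect_h.
    k = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * k, aspect_h * k

# A 27-inch 16:9 panel vs. a 34-inch 21:9 ultrawide (panel only; bezels,
# the stand and any curve change the real-world footprint).
print(panel_dimensions(27, 16, 9))   # ~(23.5, 13.2) inches
print(panel_dimensions(34, 21, 9))   # ~(31.3, 13.4) inches
```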

DPI Calculator can do the math for you, but keep in mind that the numbers only represent the panel size, not the size of the display, which includes bezels and the mount. Nor does it take into account curved displays, which tend to have smaller horizontal dimensions than their flat-screen equivalent. 

The 55-inch OLED displays look great but don't work well as desktop monitors, so you might as well use a cheaper TV that size with your console.

Can I use a TV instead?

You can certainly drive a TV from your computer, but TVs are meant to be viewed from a distance, while computer displays are designed for closer work. As TVs get smarter about gaming and consoles share space with PCs and laptops, however, the gap between the two is narrowing. So for gamers, having a primary computer display for working and a TV hooked up for gaming may make sense, at least if it's not too big. Want to do that? Here's how to use your 4K TV as a monitor.

Read more: Best gaming TV: Low input lag and high picture quality

If you want an OLED screen, a TV is still your best bet, though. We've seen a couple of 55-inch OLED monitors like the Alienware 55, but now that TVs have improved game support, you're probably better off with one of those than overpaying for a monitor. Smaller OLED monitors are trickling into the market, but still not at the desk-friendliest sizes.

Dolby Vision for Gaming on the Xbox Series X and S.

We're starting to see some monitors targeted toward console gamers, but Dolby threw a small spanner into the works for those by announcing Dolby Vision support for the Xbox Series X and S. But no gaming displays, including models like the Asus ROG Strix XG43UQ or Gigabyte AFV43U, support Dolby Vision yet. Only professional content-creation monitors like the Asus ProArt PA27UCX-K or Apple Pro Display XDR currently support it, and they only support 60Hz refresh rates and don't have the essential HDMI 2.1 features. 

That doesn't mean you should discount monitors, though: Not many games support Dolby Vision at the moment either -- just 10.

4K, 1440p, 1080p or...?

Resolution, the number of horizontal by vertical pixels that make up the image, is inextricable from screen size when you're choosing a monitor. What you really want to optimize is pixel density, the number of pixels per inch the screen can display, because that's what primarily determines how sharp the screen looks, as well as how big elements of the interface, such as icons and text, can appear. If you're gaming with a controller at distances farther than you'd sit at a desk, it can be critical.

For instance, I've discovered that I can't read the text well enough to even make it through a tutorial in 1440p on a 32-inch monitor from more than about 4 feet away.

Common screen resolutions

Standard | Resolution | Aspect ratio
Full HD (FHD) | 1,920x1,080 | 16:9
Wide quad HD (WQHD) | 2,560x1,440 | 16:9
Wide quad XGA | 2,560x1,600 | 16:10
Ultra wide quad HD | 3,440x1,440 | 21:9
Ultra HD 4K (UHD) | 3,840x2,160 | 16:9
Digital Cinema Initiatives 4K (DCI 4K) | 4,096x2,160 | between 16:8 and 16:9
5K | 5,120x2,880 | 16:9

Standard resolutions include 4K UHD (3,840x2,160 pixels), QHD (Quad HD, 2,560x1,440) and FHD (Full HD, 1,920x1,080): You're better off looking at the numbers than the alphabet soup, because when you get to variations like UWQHD they can get mind-bogglingly ambiguous. When you see references to "1080p" or "1440p," it's shorthand for the vertical resolution. But I've yet to see someone refer to 2,560x1,600 pixels, a popular new 16:10 laptop-screen resolution, as 1600p.

For example, on a 27-inch display, 1,920x1,080 has a pixel density of 81.59 ppi. On a 24-inch display, it's 91.79 ppi. Because a higher density is better (up to a point), FHD will look better on the smaller screen. This also depends on your vision: For me, at too low a resolution I can see the pixel grid, and at slightly better than that I see nothing but jaggies on small serif type. So "optimal" really depends on what you're looking at and personal preference. My preference for working, highly detailed sims, games with a lot of text and so on is at least 100ppi; if you're moving so fast there's no time to stop and shoot the flowers, you can probably drop to as low as 90ppi. Once again, DPI Calculator can do the math for you. (A related spec is dot pitch, the distance between the centers of adjacent pixels, which is just the inverse of pixel density. For that, smaller is better.)

Common pixel densities (pixels per inch)

Resolution | 24-inch | 27-inch | 32-inch
1,920x1,080 | 91.8 | 81.59 | 68.8
2,560x1,440 | 122.4 | 108.8 | 91.8
3,440x1,440 | 155.4 | 138.1 | 116.5
3,840x2,160 | 183.6 | 163.2 | 137.7
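The densities in the table fall out of the same diagonal math; here's a minimal sketch that reproduces them (the sizes and resolutions are just the ones from the table).

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: pixels along the diagonal divided by the diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

for inches in (24, 27, 32):
    densities = [round(pixels_per_inch(w, h, inches), 1)
                 for w, h in ((1920, 1080), (2560, 1440), (3440, 1440), (3840, 2160))]
    print(inches, densities)
# 24 [91.8, 122.4, 155.4, 183.6]
# 27 [81.6, 108.8, 138.1, 163.2]
# 32 [68.8, 91.8, 116.5, 137.7]
```

Dot pitch is just the reciprocal: divide 25.4 by the ppi to get the pitch in millimeters.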

What is HDR and why do I want it?

In short, high dynamic range refers to scenes rendered with brighter highlights, greater shadow detail and a wider range of color, for a better-looking image. For gaming HDR, in contrast to TV HDR, it means more than just a prettier picture: the better you can see what's lurking in the bright and dark areas, the more likely you are to avoid danger and spot clues.  

HDR used to require that games explicitly support it as well, but the introduction of Auto HDR on the Xbox Series X/S, and its forthcoming arrival in Windows 11, changes that: The operating systems can automatically expand the brightness and color ranges of non-HDR games. It's not the same as having a game that was rendered to use the expanded ranges, but it can give a non-HDR title enough of a bump to look better than it otherwise would.

At CES 2022, the organization behind the HDR10 standard announced the forthcoming HDR10 Plus Gaming standard, a variation of the HDR10 Plus spec that's been available on TVs for a while. It adds Source Side Tone Mapping (SSTM), which adjusts the brightness range on a scene level based on data embedded by the game developer -- HDR10 has a single range that has to work for the whole game. It also includes the ability to automatically trigger a display's low-latency mode, to compensate for the additional overhead imposed by the HDR data (more important for TVs than monitors), as well as support for variable refresh rates in 4K at 120Hz on consoles (still not implemented in the PS5 as of today).

Why do I need to worry about HDMI details?

Unfortunately, the HDMI specification has turned into such a mess that you can't make any assumptions about capabilities based on the version number. Not only is every HDMI 2.0 connection henceforth to be labeled 2.1a (with the same HDMI 2.0 feature set), but the specification no longer mandates any of the important new features. In other words, all the whizzy capabilities that made HDMI 2.1 desirable, especially as a choice for consoles, are now optional.

Bottom line: If you want a monitor for your console that can do 4K at 120Hz, support variable refresh rate and auto low-latency mode, you'll have to verify support for each feature individually. And the same goes if you want a PC monitor connected via HDMI that can support source-based tone mapping (discussed above) and bandwidth-intensive combinations of high resolution, fast refresh rates and high color depth/HDR.

Monitor manufacturers are supposed to list supported features explicitly; if they don't, either pass the monitor by or delve deeper. If you want the gory details, TFT Central does an excellent job explaining the issues.

Is curved or flat better?

To me, curved monitors are the best way to make a single display wider without forcing you to sit too far back; that's why they make more sense for a desktop monitor than for a TV. Optimally, you should be able to see the entire screen without moving your head too much. Once you get beyond roughly 27 inches, you'll need a curve if you're sitting at a desk. Don't get me started on the "immersive experience" of curved screens: Unless that display wraps all the way around me, it's no more immersive than any other.

If you're buying a screen that's 27 inches or smaller, aside from the fact that curved displays can look ever so much prettier, one of the few practical applications for a curve is a three-monitor gaming setup, which lets you create a better wraparound widescreen experience. Otherwise, small curved screens just aren't worth it, especially if you're paying extra for the privilege. In fact, I feel like curves on smaller screens bring the edges too far into my peripheral vision for comfort.

At CES 2022, Samsung announced its Odyssey Ark monitor, which brings a new level of elegance to gaming.

The amount of curve is expressed in "R," the radius of its arc in millimeters. Smaller numbers mean tighter arcs, so 1,800R (the radius of many 27-inch curved displays) is a tighter curve than 2,000R, and the bigger the number, the closer the screen is to flat. Too much of a curve can be distracting, while too little may as well be flat. However, ignore all the talk of how "immersive" they are. They really aren't yet, at the very least because many games still aren't able to take full advantage of the nonstandard aspect ratios. On the other hand, unlike curved TVs, you'll always be sitting in the sweet spot, so glare shouldn't be an issue.
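To put R in more concrete terms, here's a minimal sketch of the geometry. The roughly 795 mm panel width (about a 34-inch 21:9 ultrawide) is an assumption for illustration; real products vary.

```python
import math

def curved_footprint(arc_width_mm: float, radius_mm: float):
    """For a panel of a given width bent along a circle of radius R (in mm),
    return (chord, depth): the straight-line width edge to edge, and how far
    the center of the screen sits behind the plane of its edges."""
    theta = arc_width_mm / radius_mm               # central angle in radians
    chord = 2 * radius_mm * math.sin(theta / 2)
    depth = radius_mm * (1 - math.cos(theta / 2))
    return chord, depth

# Roughly 795 mm of panel at a tight curve vs. a gentle one.
print(curved_footprint(795, 1800))  # ~(788 mm, 44 mm): a bit narrower, noticeably deep
print(curved_footprint(795, 3800))  # ~(794 mm, 21 mm): nearly flat by comparison
```

That's also why a curved display ends up with a slightly smaller horizontal dimension than its flat equivalent, as mentioned earlier.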

Many widescreen models have a 21:9 aspect ratio, which means they're wider and shorter than other displays, and full-screen 16:9 video will be pillarboxed. But a large monitor without a curve at the more common 16:9 aspect ratio would require you to be bobbleheaded because it'd be quite tall: about 24 inches (61 cm) high for a 49-inch 16:9 screen versus 19 inches (48 cm) for a 49-inch 21:9 model.

Should I get two (or more) small screens or one ultrawide?

This depends on what you're doing. For instance, if you want a fast gaming monitor for play and a high-resolution display for work, it's a lot cheaper to get two than a single one that does both. Or if you need a color-accurate monitor for design but want a high-brightness one for gaming, it's also a lot cheaper to get two smaller ones. But if you just need a ton of screen space, a single ultrawide might be simpler.

Does the screen technology -- IPS, TN and more -- matter?

Sort of. For current monitors at all but the lowest, cheapest end, your choices are between VA (vertical alignment) and IPS (in-plane switching). Some manufacturers refer to their panels as "high-speed" IPS, but that's just to distance them from the old perception that IPS has slow pixel response. The reason you generally don't need to think about the technology is that other specs, such as the ones that follow, give you more meaningful ways to choose than the panel type does.

What refresh rate is good? What about GtG?

Refresh rate is the number of times per second (in hertz, or Hz) the screen can update. A mismatch between the rate at which the graphics card feeds the display and the rate at which the screen updates can produce unwanted artifacts such as blur, tearing and stuttering.

Pixel response, also known as Motion Picture Response Time or Gray-to-Gray time (though those two aren't the same thing), is how fast an individual pixel can switch states from black to white or from gray to gray (the more commonly provided spec). It's measured in milliseconds. Faster is better, and you generally want a maximum of 5ms or less GtG for all but esports-level gaming. Monitors will sometimes offer a branded motion blur-reduction mode, which performs some technological sleight of pixel to reduce perceived blur. Your mileage may vary with these.

Refresh rate and pixel response time are inextricable from each other: a display with a fast refresh rate will have a fast pixel response unless something is very wrong. Both specs are sometimes provided in an overclocked mode.
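One way to think about the two specs together: each refresh gives a pixel only so many milliseconds to finish changing before the next frame arrives. The sketch below is a back-of-the-envelope illustration, and the "half the frame time" cutoff is just a rule of thumb I'm using for the example, not a formal requirement.

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Time between screen updates, in milliseconds."""
    return 1000.0 / refresh_hz

for hz, gtg_ms in ((60, 5), (144, 5), (240, 1)):
    budget = frame_time_ms(hz)
    verdict = "fits comfortably" if gtg_ms < budget / 2 else "is cutting it close"
    print(f"{hz}Hz -> {budget:.1f}ms per frame; {gtg_ms}ms GtG {verdict}")
# 60Hz -> 16.7ms per frame; 5ms GtG fits comfortably
# 144Hz -> 6.9ms per frame; 5ms GtG is cutting it close
# 240Hz -> 4.2ms per frame; 1ms GtG fits comfortably
```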

The current "stratospheric" refresh rates are 300Hz or 360Hz, which are primarily intended for esports; they come on small screens (less than 27 inches) and only with TN panels, which don't display a wide range of colors and look terrible off-angle. Most gamers should be fine with 120Hz to 240Hz.

 You can find everything you've ever wanted to know about the subject and more at Blur Busters.

Do I need Nvidia G-Sync or AMD FreeSync?

There is a spectrum of technologies designed to compensate for the disconnect between screen update rate and gameplay frame rate, which fall under the umbrella of variable refresh rate. The disconnect can cause artifacts like tearing (where it looks like parts of different screens are mixed together), stutter (where the screen updates at perceptibly irregular intervals) and more.

At the most basic, your monitor should support generic VRR. That will enable games to use their own methods for syncing the two rates, which on the PC frequently means the game just caps the frame rate it will allow. One step up from that is generic adaptive refresh rate, which uses extended system-level technologies to vary the screen update rate based on the frame rate coming out of the game. This can deliver a better result than plain VRR, as long as your frame rates aren't all over the place within a short span of time.
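As a toy illustration of that most basic approach -- a game capping its own frame rate -- the sketch below just makes sure every loop iteration takes at least 1/cap of a second. It's a simplification of the idea, not how any particular engine implements it.

```python
import time

def run_capped(render_frame, fps_cap: float, frames: int) -> None:
    """Toy game loop that caps the frame rate by sleeping off leftover time."""
    min_frame_time = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                       # draw the frame (a stand-in here)
        elapsed = time.perf_counter() - start
        if elapsed < min_frame_time:         # finished early: wait out the remainder
            time.sleep(min_frame_time - elapsed)

# Cap a do-nothing "renderer" at 60fps for one second's worth of frames.
run_capped(lambda: None, fps_cap=60, frames=60)
```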

Beyond that, you'll see VRR technologies from Nvidia and AMD branded as G-Sync and FreeSync, respectively, each of which comes in multiple levels of complexity. If you're serious about gaming, you might want to consider waiting for monitors that support Nvidia's G-Sync Esports to ship. They'll be 27-inch 1440p models incorporating Nvidia's new sync standard as well as Nvidia Reflex for minimizing latency throughout the click-to-screen response.