How to buy a monitor for gaming or working from home

Here are the basics you need to start your search for a new monitor for work or play.

Lori Grunin Senior Editor / Advice

Should I go with 24 or 27 inches? Full HD or 4K? If questions like that are spinning around your head, chances are you've just begun to search for a new monitor to make your work-at-home (or play-at-home) setup more productive. We'll try to slow down your spinning head with this guide.


If you're just looking for a general-purpose display for working or schooling at home and don't want to think too hard about it, for adults I recommend a 27-inch flat-screen display with 4K resolution and an IPS panel. That should run about $500. If you need to go cheaper, drop to a 24-inch model with 1,920x1,080-pixel (aka full HD) resolution, which you can get for less than $150. A 22- or 24-inch screen is a good choice for kids, too, or if you need something for a small space, but honestly, it's on the small side. 

Unless you're a hard-core gamer or creative professional, many of the most technical specs -- color gamut and latency, for example -- won't really matter to you (and you should always take manufacturer specs with a grain of salt, anyway). 

When hooking up to a laptop, you need to make sure you've got the right connections: Many USB-C or USB-C/Thunderbolt 3 ports support DisplayPort Alternate Mode (alt mode), which means you can use a USB-C-to-HDMI or USB-C-to-DisplayPort cable (or adapter) to connect to a monitor with those connections. Older laptops may still have native connectors like HDMI or DisplayPort. 

Read more: Best monitors under $200 you can get right now  

Got a Mac? If it's an older MacBook with an HDMI port, or an iMac or Mac mini, you won't have a problem. More modern MacBooks with USB-C/Thunderbolt 3 connections will require an adapter or a cable with the conversion built in. You may also need to fiddle with the resolution and scaling settings in macOS, since it natively prefers a 16:10 aspect ratio rather than the 16:9 that's much more popular on Windows.  

If you want to go a little more in-depth, here are some rules of thumb to follow:

  • Within the constraints of your budget and desk space, get the largest monitor you can. You'll rarely regret buying a monitor that's too big, but you'll frequently regret buying one that's too small. There are also superwide monitors with a 21:9 aspect ratio (roughly 2.33:1). Many of these models are curved, and most of them are 34-inch displays with lower-than-4K resolution. They're mostly a specialty item for gamers. 
  • If you can afford it, go 4K. If not, choose one with a 16:9 aspect ratio, most commonly 1,920x1,080 (also called FHD, or full HD). You can find the aspect ratio by dividing the horizontal resolution by the vertical resolution; for 16:9, the result is about 1.78:1. 
  • Make sure the stand can adjust to the appropriate height for you to use comfortably and tilt to a usable angle. Depending on your needs, you may also want a stand that can swivel or allow the screen to rotate 90 degrees for use in portrait orientation. Portrait will let you see more of a vertically scrolling web page, for instance, or be a little more comfortable if you work with print layouts.
  • Go with one that you find attractive -- you'll be staring at it a lot. For many people that's synonymous with "thin bezels." You also want a stand that looks good and that has sensible cable management, allowing you to feed wires through a hole or channel to keep them together. Cable management can be important if the monitor has a USB hub, since you'll want to keep those cables under control as well.

What screen size do I need?

All else being equal, if you've got the space and budget, bigger is almost always better. Screen size labeling is based on the length of the diagonal. That made sizes easy to compare when almost every screen had the same aspect ratio (the ratio of the screen's width to its height), but wide and ultrawide desktop screens and newer laptop ratios (such as 3:2 or 16:10) make it a little more difficult.

If you remember your geometry and algebra, you can calculate the width and height of the display if you also know the aspect ratio (because width/height = aspect ratio and width² + height² = diagonal²). The further the aspect ratio is from 1:1, the wider the screen, and the more of it will sit out to the sides, and therefore in your peripheral vision, if you're close. The same math lets you figure out the physical dimensions of the screen, most notably the width, to ensure it will fit in the allotted space. DPI Calculator can do the math for you.  
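Those two relationships pin down the physical size. Here's a minimal sketch of the math in Python (the function name and the 27-inch example are my own, for illustration):

```python
import math

def screen_dimensions(diagonal, aspect_w, aspect_h):
    """Physical width and height from the diagonal and aspect ratio.

    Uses width/height = aspect ratio and width^2 + height^2 = diagonal^2.
    """
    # Diagonal of a screen measuring aspect_w x aspect_h in arbitrary units
    unit_diagonal = math.hypot(aspect_w, aspect_h)
    scale = diagonal / unit_diagonal
    return aspect_w * scale, aspect_h * scale

# A 27-inch 16:9 display:
width, height = screen_dimensions(27, 16, 9)
print(f"{width:.1f} x {height:.1f} in")  # about 23.5 x 13.2 inches
```

The width figure (about 23.5 inches) is the one to check against your desk space.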

What resolution -- 4K, FHD or ...?

Resolution, the number of horizontal by vertical pixels that make up the image, is inextricable from screen size when you're choosing a monitor. What you really want to optimize is pixel density, the number of pixels per inch the screen can display, because that's what determines how sharp the screen looks (though there are other factors), as well as how big interface elements, such as icons and text, can appear. Standard resolutions include 4K UHD (3,840x2,160 pixels), QHD (quad HD, 2,560x1,440) and FHD (full HD, 1,920x1,080). You're better off looking at the numbers than the alphabet soup, because variations like UWQHD can get ambiguous. When you see references to 1080p or 1440p, they refer to the vertical resolution.



For example, on a 27-inch display, 1,920x1,080 works out to a pixel density of 81.59 ppi; on a 24-inch display, it's 91.79 ppi. Because higher density is better (up to a point), FHD will look better on the smaller screen. This also depends on your vision: For me, at too low a resolution I can see the pixel grid, and at slightly better than that I see nothing but jaggies on small serif type. So "optimal" really depends on what you're looking at and on personal preference; mine is at least 100 ppi. Once again, DPI Calculator can do the math for you. (A related spec is dot pitch, a measure of the space between pixels; for that, smaller is better.)
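Those density figures are just the diagonal pixel count divided by the physical diagonal length; here's a quick sketch (the function name is mine):

```python
import math

def pixels_per_inch(h_pixels, v_pixels, diagonal_inches):
    """Pixel density: length of the pixel diagonal over the physical diagonal."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

print(round(pixels_per_inch(1920, 1080, 27), 2))  # 81.59
print(round(pixels_per_inch(1920, 1080, 24), 2))  # 91.79
```

Swap in any resolution and diagonal to compare candidate monitors the same way.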

But another important consideration when matching resolution to screen size is scaling. The operating system (whether Windows or macOS) can scale interface elements to be larger, but never smaller than 100%. For a given screen size, elements at 100% scale are physically bigger at lower pixel densities. The bottom line is that you can frequently scale high-density screens to make elements bigger, but you can never scale low-density screens to make them smaller. In other words, if you're buying a bigger monitor thinking you'll be able to fit more on the screen, you can't. Even in applications that let you zoom independently, like Chrome, you quickly lose readability below 100% at low pixel densities.

Common resolutions

Standard                                  Resolution     Aspect ratio
Full HD (FHD)                             1,920x1,080    16:9
Wide quad HD (WQHD)                       2,560x1,440    16:9
Wide quad XGA (WQXGA)                     2,560x1,600    16:10
Ultrawide quad HD (UWQHD)                 3,440x1,440    21:9
Ultra HD 4K (UHD)                         3,840x2,160    16:9
Digital Cinema Initiatives 4K (DCI 4K)    4,096x2,160    Between 16:8 and 16:9
5K                                        5,120x2,880    16:9

Do I want curved or flat?

To me, curved monitors are the best way to make a single display wider without forcing you to sit too far back; that's why they make more sense for a desktop monitor than for a TV. Optimally, you should be able to see the entire screen without moving your head too much, and once you get beyond roughly 27 inches, that requires a curve if you're sitting at a desk. Don't get me started on the "immersive experience" of curved screens: Unless the display wraps all the way around me, it's no more immersive than any other.

At 27 inches and below, aside from the fact that curved displays can look ever so much prettier, one of the few practical applications for a curve is a three-monitor setup, which lets you create a better widescreen experience. Otherwise, small curved screens just aren't worth it, especially if you're paying extra for the privilege. In fact, I find that curves on smaller screens bring the edges too far into my peripheral vision for comfort.


Curved displays are all the rage now. The larger ones are wide but not very tall and the 21:9 aspect ratio (at least on the bulk of the 34-inch models) means video gets pillarboxed.

Josh Miller/CNET

The amount of curve is expressed in "R," the radius of its arc in millimeters. Smaller numbers mean tighter arcs, so an 1,800R screen (a common curve for 27-inch curved displays) curves more than a 2,000R one. Too much curve can be distracting, while too little may as well be flat. However, ignore all the talk of how "immersive" they are. They really aren't yet, at the very least because many games still can't take full advantage of the nonstandard aspect ratios. On the other hand, unlike with curved TVs, you'll always be sitting in the sweet spot, so glare shouldn't be an issue.

Many widescreen models have a 21:9 aspect ratio, which means they're wider and shorter than other displays, and full-screen 16:9 video will be pillarboxed. But a large monitor without a curve at the more common 16:9 aspect ratio would leave you bobbleheaded, because it would be quite tall: A 49-inch 16:9 monitor would be about 24 inches (61 cm) high, versus about 19 inches (48 cm) for a 21:9 model.
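Those height figures follow from the same diagonal geometry described earlier; here's a quick check (the helper name is mine):

```python
import math

def screen_height(diagonal, aspect_w, aspect_h):
    """Physical height of a screen given its diagonal and aspect ratio."""
    return diagonal * aspect_h / math.hypot(aspect_w, aspect_h)

print(round(screen_height(49, 16, 9)))  # 24 inches tall at 16:9
print(round(screen_height(49, 21, 9)))  # 19 inches tall at 21:9
```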

Should I get two screens or one ultrawide?

This really depends on what you're doing. For instance, if you want a really fast gaming monitor for play and a high-resolution display for work, it's a lot cheaper to get two than a single one that does both. Or if you need a color-accurate monitor for design but want a high-brightness one for gaming, it's also a lot cheaper to get two smaller ones. But if you just need a ton of screen space, a single ultrawide might be simpler.

Does the screen technology -- IPS, TN, and so on -- matter?

You don't really need to know much about panel technology when buying a general-purpose display, except that the cheapest option, TN (twisted nematic), isn't great; VA (vertical alignment) is somewhat better; and IPS (in-plane switching) and PLS (plane-to-line switching) are effectively the same thing and currently the best options. They do differ when it comes to specific needs, such as gaming or color-critical work. Almost all of them use LCD technology; you'll frequently see backlit LCDs referred to as LED-lit. These are not related to OLED displays, which haven't really materialized for the desktop because of various technical issues. Laptops are a different story.

The reason you generally don't need to think about the underlying technology is that its effects tend to be expressed through specs and features, and those you do need to look at. Here are the relevant ones.

Color gamut

This is the range of colors a monitor can display. It's frequently expressed as a percentage of a color space, an artificial construct that encompasses all the colors a device should be able to produce for a given purpose. Color spaces are really meant for color matching across devices with different reproduction characteristics. For example, the Adobe RGB color space was designed to encompass real-world colors on a display for reproduction in print, while sRGB was designed as a lowest-common-denominator standard for typical consumer monitors viewing the web. Displays covering more than 100% of sRGB are invariably anything but TN, and usually IPS.

Screen refresh rate

This is the number of times per second (in hertz, or Hz) the screen can update, and it affects motion blur and artifacts like tearing, which occur when the rate at which the graphics card feeds the display and the display's refresh rate differ significantly. For any task in which frame rate (frames per second) matters, refresh rate may be an issue. (That predominantly means gaming, though high-frame-rate video editing or viewing may also be affected.) 60Hz is the minimum you want for comfort -- most monitors support that -- and 75Hz is comfortable for most nongaming uses. TN remains the best technology for stratospheric refresh rates: Most 300Hz and 360Hz panels are TN. But IPS panels can now hit 240Hz, which means there's much less of a tradeoff in using them for gaming than there used to be. You can find everything you've ever wanted to know about refresh rate, and more, at Blur Busters.

Pixel response

This is how fast an individual pixel can switch states, from black to white or (more commonly) from gray to gray, measured in milliseconds. Faster is better, though only gamers tend to care; you generally want 5ms GtG or less for gaming. IPS and VA panels currently run between 3 and 5ms, and anything claiming around 1ms is typically TN. Fast-action esports setups generally still use TN because of the combination of high refresh rates and fast pixel response times.


Contrast ratio

This is the ratio between the screen's 100% and 0% brightness values. Higher contrast makes everything pop more. Ignore "dynamic contrast" specifications and concentrate on anything listed as typical. Anything above 1,000:1 is fine, though I find 1,400:1 or so most comfortable.


Brightness

This is how much light the screen can emit, usually expressed in nits (candelas per square meter). Most desktop monitors run 250 to 350 nits, typically. Screens that support HDR tend to start at 400 nits and run as high as 1,600. Laptop screens are different: Because they need to be viewable in many types of lighting, including direct sunlight, they benefit from higher brightness levels even without HDR support.

Viewing angle

This is how far off-center a screen can be viewed, optimally, without serious changes in contrast or color. It remains one of TN's biggest weaknesses compared with other technologies.


Backlight

All screen technologies except OLED shine a light through various layers of color filters and liquid crystal to produce an image. OLED uses organic materials that directly emit light in a spectrum of color frequencies, which is how it can be so thin and produce a wide color gamut. Most backlit panels may display some artifacts, notably the appearance of light around the edges of a dark screen, known as backlight bleed. A newer backlight technology, mini-LED, lets a monitor use local dimming like a TV to produce high brightness with less bleed. (A standard full-LED array can do this as well, but not as effectively as mini-LED.) Mini-LED is used by the latest crop of HDR displays with brightness of 1,000 nits or more.

Does color accuracy matter?

Ballpark accuracy matters. If you shop online, for example, you want to make sure that cerulean blue shirt is roughly the color you expect to get. As long as a monitor has a less-saturated setting than "vivid" and any level of quality control, you don't need to worry about it. What tends to happen in practice is that a monitor is tuned to produce the most accurate colors it's capable of before it leaves the factory. 

But that's not the type of accuracy manufacturers are talking about when they list specs like "Delta E < 2" or say a monitor is Pantone Validated. What those mean -- or should mean -- is that the monitor has been tuned and calibrated so that a set of color patches displayed on screen matches a set of reference patches, within a small margin of error, within the bounds of a specific color space. If that's the type of color accuracy that matters to you, it adds a whole additional layer of requirements and complexity.


With its 1,600 nits brightness, color-critical accuracy and expensive stand, Apple's Pro Display XDR is the poster child for high-priced desktop monitors.

Sarah Tew/CNET

How much should I expect to spend?

Other things being equal, a display gets more expensive as resolution, screen size, refresh rate, brightness and the number and type of features increase. A broader color gamut, as well as niche capabilities for gaming or graphics, will also boost the price. But you can get a strong general-purpose monitor for less than $300.

At the moment we're in a lull before products incorporating new standards are ready, such as HDMI 2.1, so if you're OK being behind the curve until you can afford something new, then don't worry. If you're going to beat yourself up in 2021 because you didn't wait for HDMI 2.1 or affordable 8K, then either wait or buy the cheapest model that will meet your needs to tide you over.

Can I use my old TV instead?

You can certainly drive a TV from your computer, but TVs are meant to be viewed from a distance, while computer displays are designed for closer work. As TVs get smarter and higher-resolution, though, the gap between the two is narrowing. Plus, for gamers, having a primary computer display for working and a TV hooked up for gaming may make sense. Want to do that? Here's how to use your 4K TV as a monitor.  


A trendy monitor feature from a few years ago was incorporating Qi wireless charging pads in the base. Some manufacturers still do it, though it never took off for the mainstream.

Josh Miller/CNET

Features to think about

Run-of-the-mill monitors may include speakers, USB hubs, slots for memory cards and more, as well as support features like picture-in-picture when hooked up to two systems. If you're short on desk space, you might want to consider a display with these types of integrated features. There are also whole classes of important features for gaming or color-critical work. 

Read more: PS5 and Xbox Series X can game in 8K resolution. Should you care?

What else should I consider?

How to shop for one

If possible, see monitors in real life before you buy. I've headed out to buy a specific display based on the specs and ended up changing my mind when I got up close and personal with it. Displays with similar screen sizes can look or feel smaller or bigger than you expected, be more reflective or dull than you like, or have connectors that are impossible to reach. As with TVs, however, keep in mind that there are a few things you can't judge in a store. The biggest is, sadly, image quality, which includes color rendering, brightness and black level. But you can tell whether you find the screen readable and whether you think it's ugly.

It's not always practical to see a product in person, though, so read user reviews carefully. It's difficult to sort out the meaningful complaints from the not-so-meaningful ones, but look for comments about build quality and dead pixels. Unfortunately, it's harder to gauge screen quality -- brightness, contrast, color -- from reviews, because everyone's eyes are different. Just make sure you know the return policies (including dead pixel allowances), remove it from the packaging as cleanly as possible and note how to repack it, just in case.

I admit, I'm a bit of a fatalist when it comes to support. The probability of having a good support experience from a manufacturer tomorrow seems to be completely independent of the experience you had with them today, and even good support from one division doesn't necessarily mean good support from another.   

What to expect in the box

At the bare minimum, you should expect an HDMI cable and a basic stand, even with a cheap monitor. As the price rises, so does the variety of bundled cables. The stand might not matter if you're planning to use a VESA mount to put the monitor on a wall or arm, but in that case, make sure the mounting holes on the back of the monitor match your mount: The bulk of inexpensive monitors use a 100-by-100mm pattern, but some don't support a VESA mount at all. 

More computer advice