
Not just for gamers: New Nvidia Studio drivers deliver 30-bit color for Photoshop

Hath hell finally frozen over? GeForce cards can finally ditch the game-ready drivers and switch to the latest Studio update for creative-friendly features.

Lori Grunin Senior Editor / Advice

This model of the Razer Blade Advanced is morphing into a Studio version.

Sarah Tew/CNET

I never thought I'd see the day: Until today you had to spring for a pricey Nvidia Quadro workstation graphics card to properly view your shiny ray-traced renders or accurately grade HDR video in professional applications such as Adobe Photoshop and Premiere. Now that same 30-bit support is coming down to more affordable GeForce and Titan cards. And not just the RTX models -- it's "across all Nvidia product lines and GPUs."

The latest Studio driver announcement, made at Siggraph, comes in conjunction with news of more laptops added to Nvidia's RTX Studio roster, though most of them were revealed at the Studio launch. There are two new Lenovos: the Y740 15 Studio Edition and Y740 17 Studio Edition, variations of Lenovo's Legion Y740 gaming laptops with better screens for creative work.


Photoshop's "30 Bit Display" option is no longer a dummy checkbox for GeForce.

Screenshot by Lori Grunin/CNET

Photoshop has long given you the option to turn on a 30-bit color pipe between it and the graphics card. But if you enabled it on a system with a consumer-targeted GeForce or Titan graphics card, it didn't do anything. That's why there's always been such confusion over whether you could display 30-bit color with a GeForce card. I mean, there's a checkbox and you can check it!

But Photoshop and Premiere use OpenGL to communicate with the graphics card, at least for color rendering, and the specific API calls needed for deep color have only worked with Quadro cards. That can sting when you've spent over $1,000 on a GTX 1080 Ti.
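For the curious, here's roughly what "asking for deep color" looks like from an application's side. This is a minimal GLFW/OpenGL sketch for illustration only -- not Photoshop's actual code path -- and whether the driver actually grants the 10-bit framebuffer is entirely up to the GPU and driver, which is exactly the part the new Studio driver changes for GeForce and Titan.

```c
/* Minimal sketch: request a 10-bits-per-channel (30-bit) framebuffer
 * via GLFW/OpenGL, then ask the driver what it actually handed back.
 * Illustrative only; a professional app's real code path will differ. */
#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    /* Ask for 10 bits per color channel (plus 2-bit alpha). */
    glfwWindowHint(GLFW_RED_BITS,   10);
    glfwWindowHint(GLFW_GREEN_BITS, 10);
    glfwWindowHint(GLFW_BLUE_BITS,  10);
    glfwWindowHint(GLFW_ALPHA_BITS, 2);

    GLFWwindow *win = glfwCreateWindow(640, 480, "30-bit test", NULL, NULL);
    if (!win) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);

    /* Report the depth the driver actually granted. */
    int r, g, b;
    glGetIntegerv(GL_RED_BITS,   &r);
    glGetIntegerv(GL_GREEN_BITS, &g);
    glGetIntegerv(GL_BLUE_BITS,  &b);
    printf("Framebuffer depth: R%d G%d B%d\n", r, g, b);

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```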

In its briefing, Nvidia made it sound like 30-bit-on-GeForce was a brand new idea inspired by Studio users' requests. Does that mean the company was intentionally ignoring all the previous pleas -- such as this one from its own forums in 2014?

It's possible Nvidia decided it has bigger professional fish to fry with Quadro, including AI and big data, and that letting GeForce support a capability previously limited to workstation cards would boost the professional credibility of its new Studio marketing push. That seems especially likely given the adoption of AMD graphics on almost every hardware platform, as well as AMD's high-powered exclusive partner, Apple.

Or maybe it's meant to let game designers work on an Nvidia graphics card that can actually play games, without paying hundreds of dollars extra just for the added color depth; GeForce and Titan cards hold up pretty well in the midrange 3D-acceleration department.


To properly take advantage of this, you still need all the other elements -- a color-accurate display capable of 30-bit (aka 10-bit-per-channel) color, for one. The ability to handle a 30-bit data stream is actually pretty common now -- most displays claiming to decode HDR video, which requires a 10-bit transform, can do it -- but you won't see much of a difference without a true 10-bit panel, which is still pretty rare outside professional monitors.
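If you want the math behind those labels: 30-bit color means 10 bits per channel, or 1,024 shades of each primary instead of the 256 you get at 8 bits. A quick back-of-the-envelope calculation (plain C, nothing more than arithmetic) shows how that balloons the total palette from about 16.7 million colors to roughly 1.07 billion.

```c
/* Back-of-the-envelope math behind "24-bit" vs. "30-bit" color:
 * shades per channel and total displayable colors. */
#include <stdio.h>

int main(void)
{
    for (int bits = 8; bits <= 10; bits += 2) {
        unsigned long levels = 1UL << bits;   /* 2^bits shades per channel */
        unsigned long long colors =
            (unsigned long long)levels * levels * levels;  /* R x G x B */
        printf("%2d bits/channel (%2d-bit color): %4lu shades, %llu colors\n",
               bits, bits * 3, levels, colors);
    }
    /* Prints about 16.7 million colors for 24-bit and about 1.07 billion for 30-bit. */
    return 0;
}
```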

That's because most people associate insufficient bit depth with banding, the appearance of visually distinguishable borders between what should be smoothly graduated color. Monitors have gotten good at disguising banding artifacts by visually dithering the borders between colors where necessary. But when you're grading HDR video or painting on 3D renders, for example, dithering doesn't cut it. 
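Here's a toy illustration -- not any monitor maker's actual algorithm -- of what that dithering does when a smooth 10-bit ramp gets squeezed onto an 8-bit panel: straight truncation collapses every four input levels into one hard step (banding), while a small ordered-dither offset spreads each transition across neighboring pixels so it reads as smooth, at the cost of pixels that no longer match the source.

```c
/* Toy illustration only: squeezing a smooth 10-bit ramp down to 8 bits.
 * Truncation collapses every four input levels into one output code,
 * creating hard steps (banding). Adding a small ordered-dither offset
 * before truncating spreads each transition across neighboring pixels
 * so it looks smoother -- but individual pixels no longer match the
 * source values. */
#include <stdio.h>

int main(void)
{
    static const int dither4[4] = {0, 2, 3, 1};  /* thresholds for a 4:1 level ratio */

    for (int x = 0; x < 16; x++) {
        int src10     = 512 + x;                        /* a gentle 10-bit gradient */
        int truncated = src10 >> 2;                     /* straight 10-to-8-bit truncation */
        int dithered  = (src10 + dither4[x & 3]) >> 2;  /* dither, then truncate */
        printf("10-bit %3d -> truncated %3d, dithered %3d\n",
               src10, truncated, dithered);
    }
    return 0;
}
```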

And the extra precision is surely welcome when your doctor is trying to tell the difference between a tumor and a shadow on his cheap system. From Nvidia's own white paper in 2009: "While dithering produces a visually smooth image, the pixels no longer correlate to the source data. This matters in mission-critical applications like diagnostic imaging where a tumor may only be one or two pixels big."
