Nvidia, AMD gaming graphics buck green-PC trend

The chipmakers offer really fast graphics boards for enthusiast game PCs--and the result is new boxes that can literally blow a fuse.

Brooke Crothers Former CNET contributor

There is an ungreen revolution taking place in enthusiast game PC circles.

A 1,250-watt power supply--this one from Cooler Master--is the largest a game PC maker will install today. Cooler Master

The eye-opening graphics possible on today's game PCs come at a cost: light-dimming power consumption. The trend, rooted in the perennial quest for more speed, bucks the overall greening of the PC industry.

Green PC designs have become more than just practical; they're cool. Power-sipping Netbooks are in, as are small desktops like the Dell Studio Hybrid and Hewlett-Packard Pavilion Slimline.

This is not the case for high-end gaming PCs, where bigger is better. How far the trend can go isn't clear, but a seminal event in Apple's history may offer a lesson. In 2001, Apple unveiled one of the first dual-processor consumer systems, based on the overheating-prone IBM PowerPC G4 processor. The original tower design had a Rube Goldberg feel to it, with a host of fans straining to rid the system of heat. The wind-tunnel noise generated by the power supply and fans eventually forced Apple to redesign the system.

This symbolized why Apple eventually abandoned PowerPC: The platform wasn't efficient with power.

Fast-forward to 2008. Game rig makers are cramming as many as four graphics chips into high-end boxes that are notable not only for performance but also for the power they consume. As a consequence, big power supply units are in vogue. Today, bragging rights extend to the units themselves: supplies bearing boutique brand names such as Cooler Master and SilverStone deliver 1,200 watts--roughly three times what game systems required a few years ago.

It's an ominous trend, according to box makers. "If this trend does continue, then, yes, it will give us problems," said George Yang, an engineer at Los Angeles-based game rig maker IBuyPower. "A regular home user would have to have an electrician come in, get the outlet out, and plug in a higher breaker," Yang said. Today, some of the higher-end systems with big power supplies require a special wall power socket, according to Yang.

Other game rig makers are equally concerned. "I swore that I'd never break 1,000 (watts)," said Kelt Reeves, president of game PC maker Falcon Northwest. "Unfortunately, that's been the solution for the past several years. Bigger, bigger, bigger power supplies."


Reeves says that 1,200 watts is now essential for gaming systems based on multiple boards from Nvidia or AMD's ATI graphics unit. "With three GTX 280s or two of the R700 cards, we're recommending they go with a 1,200-watt power supply," Reeves said, referring to the newest graphics chips from Nvidia and ATI respectively.

This is just about the limit, he said. "We can't go too much more over that before--if you actually pull that (power)--you start tripping the client's household circuit breaker."
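The circuit-breaker arithmetic behind Reeves' warning is easy to sketch. The figures below are illustrative assumptions, not from the article: a standard U.S. branch circuit carries 15 amps at 120 volts, continuous loads are conventionally kept to 80 percent of that, and a loaded power supply is assumed to be about 85 percent efficient.

```python
# Rough sketch of the household-circuit math behind the 1,200-watt ceiling.
# All figures are illustrative assumptions, not from the article.

def wall_draw_watts(psu_output_w: float, efficiency: float) -> float:
    """Power drawn from the wall outlet for a given PSU DC output."""
    return psu_output_w / efficiency

# A typical U.S. branch circuit: 15 A at 120 V.
circuit_capacity_w = 15 * 120                   # 1,800 W absolute limit
continuous_limit_w = 0.8 * circuit_capacity_w   # 1,440 W continuous-load rule of thumb

# A fully loaded 1,200 W supply at an assumed 85% efficiency:
draw = wall_draw_watts(1200, 0.85)              # ~1,412 W at the wall

print(f"Wall draw: {draw:.0f} W of a {continuous_limit_w:.0f} W continuous budget")
```

Under those assumptions a maxed-out 1,200-watt system leaves almost no headroom on the circuit for a monitor, speakers, or anything else--hence the tripped breakers.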

Neither Nvidia nor ATI shows any signs of slowing down, according to Reeves. "Eventually these chips get so hot that their own heat becomes a barrier to performance," he said.

Nvidia admits that its chips are drawing more power than before. "If we go back about three years, our graphics card power was in the 120- to 130-watt range," said Jason Paul, product manager in charge of enthusiast GPUs (graphics processing units) at Nvidia. "The GTX 280 which we launched a couple of months back, it's around 230 watts (of) graphics card power," he said.

But Paul claims the performance per watt is the key yardstick, not raw power. "Where you see a little under 2X increase in maximum power, you've seen probably 3-times or 4-times (the) increase in the level of performance. So, overall we see a substantial improvement in performance per watt. This is the big metric we track to ensure we're delivering efficient architectures."
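Paul's argument is simple arithmetic, and a minimal sketch makes it concrete. The numbers here are assumptions for illustration--an older card at roughly 125 watts as the baseline, and a 230-watt card delivering three times its performance; only the ratios matter.

```python
# Sketch of the performance-per-watt comparison Paul describes.
# "Performance" is in arbitrary units; the figures are assumed, not measured.

def perf_per_watt(performance: float, watts: float) -> float:
    return performance / watts

old = perf_per_watt(1.0, 125)   # baseline card, ~125 W
new = perf_per_watt(3.0, 230)   # newer card: 3x performance at ~230 W

print(f"Efficiency improved {new / old:.2f}x")
```

Under those assumed figures, a bit under twice the power buys three times the performance, so efficiency per watt still improves by roughly 60 percent.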

Paul says Nvidia has implemented power-saving techniques on its GTX 280 that keep the power down when it's not running at top performance loads. "With the GTX 280 at idle, that card runs at about 25 watts, which is one-tenth of its absolute worst-case power," he said. Nvidia also offers hybrid graphics technology that turns off the power-hungry discrete boards when they're not in use.

Dell XPS 730 game box uses special liquid cooling to control heat. Dell Computer

Moreover, Paul says that the multiboard systems are limited to a small niche at the very top of the market. "There's definitely a segment of the market that wants more and more performance. Remember, however, that this is the ultimate performance (segment)."

But game box makers ship many--if not most--of their systems to the very niche that Paul is describing. "We're all about the high end. The higher-end the graphics card is, and the more expensive, the more we sell," said Reeves.

And the trend in power supplies exemplifies how this market has changed. "The power supply used to be just (a) silver box, and nobody gave it a second thought," he said. "(But) as graphics cards have evolved, they have forced the power supply makers to keep providing more and more power pipes--or cabling--to the graphics cards"--increasing the unit's complexity, he said.

Reeves cites GPUs, not CPUs from Intel, as the culprit. "The latest CPUs use very little wattage. If you overclock a 3GHz Intel CPU to 4GHz, you might pull 40 more watts. Whereas a graphics card, you put three of them in a system, they'll pull 800 watts running some of the higher-end games," he said.