
The Gizmo Report: NVIDIA's GeForce GTX 280 GPU-- gaming

NVIDIA launches the GTX 280 graphics processor, and Peter Glaskowsky gets some hands-on time with a graphics card based on the new chip. Here's how it compares with a pair of older ATI Radeon boards on four popular PC games.

Peter Glaskowsky
Peter N. Glaskowsky is a computer architect in Silicon Valley and a technology analyst for the Envisioneering Group. He has designed chip- and board-level products in the defense and computer industries, managed design teams, and served as editor in chief of the industry newsletter "Microprocessor Report." He is a member of the CNET Blog Network and is not an employee of CNET.

Graphics performance improves rapidly. We can be confident that each new generation of graphics chips will be faster than the previous one, and that AMD and NVIDIA will regularly surpass each other with new product launches. I've been watching this process professionally since 1996, when I began covering graphics technology for Microprocessor Report.

NVIDIA's GeForce GTX 280 graphics chip (image: NVIDIA Corporation)

As of today, NVIDIA is on top. The new GeForce GTX 280 is the fastest graphics chip you can get. See the first part of this review for details of the chip itself.

If you can get one, anyway. NVIDIA says boards based on the GeForce GTX 280 and its companion GeForce GTX 260 will be available "in quantity" tomorrow (June 17), but if previous launches are any indication, those quantities won't be enough to satisfy everyone.

And you may not be able to afford one-- a GTX 280 board with 1GB of RAM will likely be priced around $649, while GTX 260 boards with 896MB will go for about $399. (The GTX 280 / 1GB board I tested was made by NVIDIA, so it isn't necessarily representative of commercial products.)

But avid gamers won't be discouraged by these prices. Both AMD and NVIDIA like to point out that an expensive graphics card is a much better investment than a high-end CPU or motherboard if you care about gaming.

The standard of comparison for gaming performance is the number of frames per second that can be rendered for a given combination of screen resolution and quality features... or, conversely, what resolution and features can be used without reducing the frame rate below a playable level.

So in my own testing, I used frame rate as the metric for games that ran acceptably with maximum quality at my monitor's maximum resolution (1,600 x 1,200 pixels), and achievable quality settings as the metric for the rest.
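
To make that metric concrete, here's a minimal sketch in Python (the frame times are made-up numbers, not measurements from my runs) of how per-frame render times become the average frame rate figures reviewers quote:

# A minimal sketch: turning per-frame render times (milliseconds) into
# the frames-per-second figures quoted in reviews. The sample data is
# hypothetical, not measured.

frame_times_ms = [16.1, 17.0, 15.8, 33.5, 16.4, 16.2]

def average_fps(times_ms):
    # Average FPS over the run: total frames divided by total seconds.
    return len(times_ms) / (sum(times_ms) / 1000.0)

def worst_case_fps(times_ms):
    # FPS implied by the single slowest frame; stutter shows up here
    # even when the average looks fine.
    return 1000.0 / max(times_ms)

print("average: %.1f fps" % average_fps(frame_times_ms))
print("worst frame: %.1f fps" % worst_case_fps(frame_times_ms))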

I did my testing with four games:

Company of Heroes, from Relic Entertainment

Assassin's Creed, from Ubisoft

Age of Conan: Hyborian Adventures, from Funcom

Crysis, from Crytek

(Age of Conan was provided by NVIDIA with the GTX 280 board. I got Company of Heroes at a previous NVIDIA event. I bought the other titles, as well as several others I won't describe here.)

The system I used for testing was a 2006-vintage Core 2 Duo system based on an Intel D975XBX motherboard and a 2.93 GHz processor overclocked to 3.2 GHz. It was originally equipped with dual ATI Radeon X1900 XTX PCI Express graphics cards connected as a Crossfire pair, a configuration that can deliver almost twice the rendering performance of a single card when driving one display. This configuration was about as good as gaming systems could be in late 2006.

I set up all of the games on this system in its original configuration, then replaced the ATI graphics cards with the one NVIDIA GTX 280 reference board.

Company of Heroes dates back almost two years, and it shows. The game looks pretty good, but it was no match for the Radeon Crossfire arrangement. Even with all quality features set to their maximum levels, the game could still produce an average frame rate of about 60 fps (frames per second) using its internal benchmarking test.

Assassin's Creed is more recent-- the PC release I tested came out just a couple of months ago-- but it also played well on the Radeon boards. The game produced good results with all available quality settings maxed out. Oddly, the Windows Vista Games Explorer, which displays "minimum" and "recommended" requirements, says that my test system doesn't measure up to the recommended requirements of this game.

Age of Conan is the most recent game in the set. This multi-player online game had its full release on May 20. Although originally expected to support version 10 of Microsoft's DirectX graphics, the game shipped with only DX9 support. In spite of this, the game is very graphics-intensive and looks very good. The Games Explorer recommendation was met by the ATI hardware, but the game still wouldn't play well with maximum quality and resolution settings. I did most of my testing with the ATI cards using "medium" quality, maximum resolution (the 1,600 x 1,200-pixel limit of my monitor), and no antialiasing (a technique for producing smoother, more realistic edges on objects).

Crysis, which provides the most advanced graphics support of the games I tested-- and perhaps of all games available today-- also required "medium" quality settings on the ATI cards and no antialiasing. With these settings, I still encountered moments in the game when the screen updated very slowly. Although still playable, this was the only game that was not entirely satisfactory on the 2006 hardware.

NVIDIA's GeForce GTX 280 reference board (image: NVIDIA Corporation)

Once I had some baseline measurements and observations for the Radeon graphics cards, it was time to swap in the GTX 280. This was not as easy as it should have been for a number of reasons, including a minor mechanical problem with the card itself. The biggest problem I had was that the GTX 280 reference board-- like the chip itself-- is huge. It's like the monolith from "2001: A Space Odyssey" with a pair of DVI connectors at one end. It's two slots wide because of the fan and heat sink required to deal with the board's 236W power rating.

Yes, 236 watts. That's what we call "thermal design power" (TDP), the maximum amount of power that is likely to be consumed in normal operation. Still, that's in line with other high-end graphics cards, and NVIDIA says it greatly reduced the idle power consumption of the card, which helps save energy during ordinary operation.

Another problem with the GTX 280 was its requirement for two additional power connections-- one six-pin plug and one eight-pin plug. Both are defined in the PCI Express specification and found on current high-end PC power supplies.

My test system had two of the six-pin plugs for the two original dual-slot Radeon cards, but I fashioned a short cable to adapt one of those plugs to the eight-pin PCIe socket. Since the eight-pin socket actually only has three power contacts, just like the six-pin plug, such an adapter will normally work fine, and in fact I had no problems with this arrangement. But my recommendation is to upgrade your power supply instead.
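
The arithmetic behind those two plugs is straightforward. Here's a back-of-the-envelope sketch using the standard PCI Express power ratings (75 watts from the x16 slot, 75 watts from a six-pin plug, and 150 watts from an eight-pin plug):

# Back-of-the-envelope power budget for the GTX 280, using the standard
# PCI Express ratings for each source.
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # auxiliary six-pin plug
EIGHT_PIN_W = 150  # auxiliary eight-pin plug

available = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300W total
tdp = 236  # NVIDIA's stated thermal design power

print("available: %dW, TDP: %dW, headroom: %dW" % (available, tdp, available - tdp))
# With only the slot and a six-pin plug, the budget would be 150W --
# well short of 236W, which is why the board requires both connectors.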

Once the new board was installed and working properly, I was able to run through the games.

Company of Heroes and Assassin's Creed really didn't look or work any better on the GTX 280 than on the Radeon cards, which is what I expected. Any game that fits within the limits of an older graphics card simply doesn't have room to improve on a newer model.

With the GTX 280, Age of Conan could be played with maximum quality and antialiasing enabled, producing significant improvements in visual quality during gameplay. Still, I don't think I'd have replaced the graphics card just for this game, even if I spent most of my life in it-- as I expect some people will do.

The real payoff for the new card was in Crysis, where the GTX 280 made the "high" quality settings practical. As good as the GTX 280 is, however, Crysis can still demand more than the card can deliver. The full display resolution was only achievable with antialiasing turned off, and even then, I was only getting about 40 frames per second in the game. At 1,024 x 768-pixel resolution, I could enable four-sample antialiasing. This produced a more pleasing visual appearance but less fine detail.
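
The tradeoff is easy to put numbers on. A rough sketch, assuming the game's four-sample mode is conventional multisample antialiasing (which computes coverage per sample but shades each pixel only once); real rendering cost doesn't scale purely with these counts:

# Rough per-frame workload comparison for the two Crysis configurations.
# This is a simplification, but it shows why dropping the resolution
# freed up enough headroom to turn antialiasing on.
pixels_high = 1600 * 1200        # 1,920,000 pixels shaded per frame
pixels_low = 1024 * 768          #   786,432 pixels shaded per frame
samples_low_4x = pixels_low * 4  # 3,145,728 coverage samples with 4x AA

print("1600x1200, no AA: %d pixels shaded" % pixels_high)
print("1024x768, 4x AA: %d pixels shaded (%d samples)" % (pixels_low, samples_low_4x))
# The antialiased mode touches more samples but shades far fewer unique
# pixels -- smoother edges at the cost of less fine detail, as described above.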

NVIDIA's GeForce GTX 280 graphics card in a 3-way SLI arrangement (image: NVIDIA Corporation)

True Crysis addicts will likely want to use multiple GTX 280 cards using NVIDIA's SLI technology, which (like ATI's Crossfire) lets multiple cards work together to drive a single monitor. Up to three cards per system are supported, but that would require a heck of a system to provide enough PCI Express bandwidth and power, and a lot of money as well. That's about $2,000 worth of graphics cards alone.

Like Age of Conan, Crysis looks great on the GTX 280. The graphics still aren't lifelike, but it's getting easier and easier to ignore the shortcuts taken to produce real-time 3D and focus on the gameplay. Interestingly, neither of these games really seemed to stress the GTX 280 even though they were running near the card's limits in some respects. The fan on the card never seemed to be very loud. That could just be a tribute to the fan, I suppose, but I've used plenty of dual-slot graphics cards over the years and some of them have been loud enough to drown out the sound effects from the games.

The GTX 280 is good for more than just gaming, however. It's also capable of accelerating video playback, encoding, and scientific processing. I'll talk more about these applications and related issues in the future.