
Nvidia GeForce GTX 295 review


Rich Brown

ATI has given Nvidia some stiff competition on the 3D card front over the past six months or so, but with the dual-chip GeForce GTX 295, Nvidia has raced back to the top of the performance pile. At $500 for a boxed version (from Nvidia's board partners), the GTX 295 is aimed at serious PC gamers, but it's also the best value among high-end boards, edging out the best cards from ATI. Its significant power demands mean you'll need a beefy PC to run it, but anyone with the financial and electrical wherewithal to put the GTX 295 to work will enjoy the best 3D hardware currently on offer.

9.0

Nvidia GeForce GTX 295

The Good

Best single-card 3D performance available; more power efficient than its competition; PhysX support adds some bells and whistles to a few games; DVI and HDMI output.

The Bad

Still a power hog, despite its relative efficiency.

The Bottom Line

Nvidia's GeForce GTX 295 is the single fastest 3D card on the market, and for a relatively aggressive price. Added bonuses like power efficiency and PhysX support sweeten the deal, but even without those extra benefits, we'd still recommend this card for its processing power and comparative value.

Like its primary competition, the ATI Radeon HD 4870 X2, the GeForce GTX 295 uses the familiar two-chips, one-card design we've seen from both Nvidia and ATI in the past. The Radeon HD 4870 X2 has been a popular component in a few recent high-end gaming PCs, and with support for multiple graphics chips and cards now so prevalent in PCs, these dual-chip cards give gamers a relatively easy way to set up a quad-GPU configuration.

The popularity of ATI's card had to do with the fact that it outperformed Nvidia's previous high-end behemoth, the $600 single-chip GeForce GTX 280, for roughly $100 to $150 less. The GeForce GTX 295 closes both of those gaps and also offers some noticeable power consumption savings.

AMD's aggressive pricing of its high-end Radeon cards surely contributed to Nvidia bringing the GeForce GTX 295 in for under $600. Nvidia suggested $500 as the starting price for this card, and retailers seem to be following that line so far. This is roughly the same as the price for stock Radeon HD 4870 X2 cards.

At first glance, the speeds and specs above might seem to give the Radeon an engineering advantage over the Nvidia card. Nvidia uses slower, older RAM, and less of it, and both its core clock speed and its number of stream processors (the processing pipelines on the chip that handle various kinds of data requests simultaneously) are lower as well. We suspect Nvidia has two less obvious advantages at work that help its performance.

One is the manufacturing process: the GTX 295 uses two 55-nanometer GTX 200 graphics chips, and cramming two of the older 65-nanometer GTX 200 chips onto one card would have been a power consumption nightmare. The other is shader speed: we have no information from ATI on the clock speed of its stream processors, but our suspicion is that they're significantly slower than the 1.24GHz stream clock on each chip in the GTX 295.
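
If you want to see why the raw spec sheet favors the Radeon, a rough back-of-the-envelope calculation helps. Only the 1.24GHz stream clock comes from the discussion above; the per-chip ALU counts, the Radeon's clock, and the flops-per-clock figures in this sketch are commonly cited numbers we're assuming purely for illustration.

def peak_gflops(alus, clock_ghz, flops_per_clock, chips=2):
    # Theoretical peak: ALUs x clock x flops-per-clock, times two chips per card.
    return alus * clock_ghz * flops_per_clock * chips

# GTX 295: the 1.24GHz stream clock comes from the review; the 240-ALU count per
# GTX 200 chip and the 3-flops-per-clock figure are commonly cited numbers we're
# assuming for illustration.
gtx_295 = peak_gflops(alus=240, clock_ghz=1.24, flops_per_clock=3)

# Radeon HD 4870 X2: the 800-ALU count, the ~750MHz clock, and the 2-flops-per-clock
# figure are likewise assumptions, since ATI doesn't break out its shader speed.
hd_4870_x2 = peak_gflops(alus=800, clock_ghz=0.75, flops_per_clock=2)

print(f"GTX 295:    ~{gtx_295:,.0f} GFLOPS theoretical peak")
print(f"HD 4870 X2: ~{hd_4870_x2:,.0f} GFLOPS theoretical peak")

Even with the faster stream clock factored in, the Radeon comes out ahead on paper, which is a good reminder that theoretical peak numbers are a poor predictor of real game performance; the benchmarks below tell a different story.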

Far Cry 2 (ranch medium, DirectX 10, very high)
(in frames per second; longer bars indicate better performance)

                           1,440x900   1,680x1,050   1,920x1,200
Nvidia GeForce GTX 295        87           76            62
Asus EAH4870X2                76           63            53

Left 4 Dead (DirectX 9, 8x AA, 16x AF, very high)
(in frames per second; longer bars indicate better performance)

                           1,440x900   1,680x1,050   1,920x1,200
Nvidia GeForce GTX 295       161           154           141
Asus EAH4870X2               162           148           133

For some background on our 3D card testing methodology: we picked our test resolutions to correspond with the native resolutions of 19-inch, 22-inch, and 24-inch wide-screen LCDs. The only oddball is Crysis, which for some reason supports the 16:9 aspect ratio common to HDTVs, but not the 16:10 ratio common to wide-screen PC displays. These being the highest-end 3D cards on the market, we also picked the highest possible image quality settings for each game, with the exception of anti-aliasing. For AA we kept to 8x and avoided chip-specific anti-aliasing settings wherever possible, although the GeForce GTX 295 can hit up to 16x AA, depending on the game. We made a custom time demo for Left 4 Dead (before this week's patch, which unfortunately broke our demo file), but in all other cases we used built-in benchmarks or, in the case of Crysis, the downloadable Assault Harbor time demo included with Mad Boris' Crysis benchmarking tool.
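
As a side note, a quick way to see why those three resolutions belong together (and why Crysis is the odd one out) is to reduce each to its aspect ratio. The snippet below is just an illustrative check, not part of our benchmarking tools.

from math import gcd

# Quick check that our three test resolutions share the 16:10 wide-screen PC
# aspect ratio, which is why Crysis -- 16:9 only -- is the odd game out.
for w, h in [(1440, 900), (1680, 1050), (1920, 1200)]:
    d = gcd(w, h)
    reduced = f"{w // d}:{h // d}"  # each reduces to 8:5, i.e. 16:10
    print(f"{w}x{h} -> {reduced}", "(16:10)" if w * 10 == h * 16 else "(other)")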

Regardless of the technical explanation, the GeForce GTX 295 was simply faster than the Asus EAH4870X2 on almost all of our 3D gaming tests. The only exception is a one-frame advantage for the Asus card on the 1,440x900 Left 4 Dead test. In fairness, the GTX 295's wins aren't by embarrassing margins either, but the Far Cry 2 gaps in particular are large enough to be noticeable. Given the GTX 295's convincing lead across multiple game engines, in both DirectX 9 and DirectX 10, and at multiple resolutions, we're comfortable recommending it as the top single card you can buy.
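
To put those margins in numbers, here's a quick calculation using nothing but the frame rates from the charts above.

# Percentage leads for the GTX 295 over the Asus EAH4870X2, computed from the
# frame rates in the charts above.
far_cry_2 = {"1,440x900": (87, 76), "1,680x1,050": (76, 63), "1,920x1,200": (62, 53)}
left_4_dead = {"1,440x900": (161, 162), "1,680x1,050": (154, 148), "1,920x1,200": (141, 133)}

for game, scores in (("Far Cry 2", far_cry_2), ("Left 4 Dead", left_4_dead)):
    for res, (gtx_295, radeon) in scores.items():
        lead = (gtx_295 - radeon) / radeon * 100
        print(f"{game} at {res}: GTX 295 {lead:+.1f}% vs. the Radeon")

That works out to a lead of roughly 14 to 21 percent in Far Cry 2, and low single digits in Left 4 Dead aside from the one-frame loss at 1,440x900.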

Power consumption
(in watts; shorter bars indicate better performance)

                           Load   Idle
Nvidia GeForce GTX 295      418    196
Asus EAH4870X2              447    212

We're also happy to point out that the GTX 295 is relatively power efficient compared with the Asus card. We say relatively because Nvidia's card still consumes more than 400 watts under load. That's more power draw for a 3D card alone than required by most budget desktops. Still, the GeForce card is measurably more efficient than the Radeon. This relative efficiency is another benefit of moving the GTX 200 chip to the 55-nanometer manufacturing process mentioned above. The GTX 295 requires one six-pin and one eight-pin connection to your PC's power supply, and because we'd recommend a 750-watt or better power supply to go with this card, we can't exactly argue that it's the greenest component out there. But if you must spend $500 or so on a top-end 3D card, the GTX 295 is at least greener than its competition.
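
If you'd like to translate that efficiency gap into dollars, the sketch below runs our measured numbers through a couple of clearly labeled assumptions about usage and electricity pricing; treat it as an illustration, not a projection of your own bill.

# Rough annual running-cost difference between the two test systems, using the
# measured load and idle draw above. The usage pattern (four hours under load and
# four hours idle per day) and the 12-cents-per-kWh rate are illustrative
# assumptions, not figures from our testing.
GTX_295 = {"load": 418, "idle": 196}      # watts, from the chart above
HD_4870_X2 = {"load": 447, "idle": 212}   # watts, from the chart above

HOURS_LOAD, HOURS_IDLE, DOLLARS_PER_KWH = 4, 4, 0.12

def annual_cost(draw):
    # Daily watt-hours -> annual kilowatt-hours -> dollars.
    wh_per_day = draw["load"] * HOURS_LOAD + draw["idle"] * HOURS_IDLE
    return wh_per_day * 365 / 1000 * DOLLARS_PER_KWH

savings = annual_cost(HD_4870_X2) - annual_cost(GTX_295)
print(f"The GTX 295 system saves roughly ${savings:.0f} a year under these assumptions")

The dollar savings are modest, on the order of a few dollars a year with these assumptions, so the efficiency case is more about the card's lower peak demands than about meaningful savings on your power bill.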

There is, of course, more to the story of the GTX 295 and graphics cards in general. Nvidia has been particularly vocal about the capabilities of its 3D cards beyond mere triangle processing. For example, Nvidia made a physics driver available earlier this week that adds support for its PhysX accelerated game physics software to the PC version of Mirror's Edge. The GTX 295 handled the added processing work without a hitch, but we can't say we found the added effects that worthwhile. Yes, cloth and particle effects like shattering glass and smoke behaved more realistically, bouncing off surfaces and responding to your actions. But in very few cases do the added effects offer more than a cosmetic improvement to the game experience, and even then they feel tacked on to the Mirror's Edge world (which already has a modular, impersonal feel).

This is not to say we're against PhysX, accelerated game physics in general, or Nvidia's other efforts to differentiate its hardware beyond simple frame rates. What we'll call the parallel programming effort, as represented by Nvidia's CUDA, Apple's OpenCL, ATI's Stream processing, and Microsoft's forthcoming parallel computing support in Windows 7, via DirectX 11, will likely affect commonly used software in the coming years, and we're excited to see what develops. But while Nvidia and ATI both offer some parallel processing in dribs and drabs now, we have yet to see an implementation of this capability that compels us toward one vendor's hardware over another's.

Finally, home theater enthusiasts (and even some PC LCD owners) will be glad to know that like our engineering sample, all of the retail GeForce GTX 295 boards ship with two DVI outs and an HDMI output. You still need to wire the audio signal from your PC's digital output to the card itself (a hassle ATI avoided by integrating an audio chip into all of its new 3D cards), but once that's done, connecting the GTX 295 to an HDTV or a projector should be simple.

Test bed configuration:

Windows Vista Ultimate SP1 64-bit; 3.2GHz Intel Core i7 965; Intel X58 chipset; 4GB 1,066MHz DDR3 SDRAM; 150GB 10,000rpm Western Digital Raptor hard drive

9.0

Nvidia GeForce GTX 295

Score Breakdown

Design 8 | Features 9 | Performance 10