Nvidia's GeForce 7950 GX2 graphics card should be a PC gamer's dream. For a suggested price of between $600 and $650, the card gives you the same 3D gaming performance as two GeForce 7900 GTX cards at half the cost. And because Nvidia crammed two graphics processors onto a single card, you lose only half as much interior desktop real estate. The problem is that the GeForce 7950 will be an entire generation behind in roughly six months. We can live with the iterative updates that happen within a generation of 3D chips, but the forthcoming shift from DirectX 9 hardware--such as the GeForce 7950--to DirectX 10-based gear later this year is too monumental a change to justify spending $600 on this card now. It's too bad, because there's a lot to like here.
Previously, when you wanted two 3D cards inside your PC (for the purposes of fast, high-resolution, high-detail gaming), you needed to buy two cards from ATI or Nvidia, plug each card into your PC's power supply, and join them together via either Nvidia's internal SLI connector or ATI's external CrossFire dongle. That process limits your expansion options--the bulky cards block adjacent slots--and increases the likelihood that you'll need a massive power supply. Thanks to its all-in-one dual-3D-chip package, the GeForce 7950 GX2 solves all of those problems. A single GeForce 7950 GX2 still takes up two expansion slots' worth of space, but instead of losing two adjacent slots, now you lose only one. And because it requires only a single connection to your PC's power supply, you don't need the 600-watt, 750-watt, or 1-kilowatt power supplies that have become far too common. Instead, Nvidia recommends just a 400-watt power supply for a single GeForce 7950 GX2, a marked improvement.
So it's neither the design of the card nor its requirements that hold us back, but rather the 3D chips themselves. The GeForce 7000 series has been a solid performer for Nvidia. It helped usher in the dual-card SLI era, and even though ATI's Radeon X1000s can jump through a few more hoops, we think most gamers would agree that this current generation of 3D chips has served the gaming public well. If that sounds like an epitaph, you're not far off. The problem is Windows Vista, or more specifically, Vista's updated multimedia programming interface, DirectX 10.
DirectX is Microsoft's Rosetta stone for bridging hardware and software. As long as software programmers and hardware developers design their products around DirectX, compatibility should be guaranteed. The current version for Windows XP is DirectX 9.0c. Microsoft has stated that Vista will support DirectX 9 software and hardware but that the OS will ship with DirectX 10. The GeForce 7950 GX2 (along with the rest of the GeForce 7000 series), however, is a DirectX 9 part. This means that while it will work with Vista, it won't support the latest and greatest 3D features, so your $600 investment will be out-of-date only six months after you buy it, assuming Microsoft hits its targeted January Vista release date. While neither Nvidia nor ATI has announced a DirectX 10 chip yet, you can bet that such cards will be out, or very near release, by the time Vista launches.
All of which is too bad, because the GeForce 7950 GX2 really is a fast 3D card. Its core clock speeds trail those of Nvidia's single-card flagship, the GeForce 7900 GTX. That card runs its GPU at 650MHz and its memory at 800MHz, whereas the GeForce 7950 GX2 has a 600MHz GPU clock and only 500MHz for each chip's 512MB of DDR3 RAM. We're not surprised by the clock speed reductions given the heat and power issues inherent to running two fast GPUs in a single-card package, but we were surprised by the performance results. (Props to Sarju Shah, GameSpot's illustrious associate hardware editor, for sharing his benchmark scores with us.)
(Benchmark chart omitted; higher scores indicate better performance.)