
Nvidia conference is all about the other processor

At its first-ever Nvision conference, Nvidia makes a case for the graphics processing unit, the other chip inside the PC.

Brooke Crothers, former CNET contributor

SAN JOSE, Calif.--Nvidia is making a case for the graphics processing unit, the other chip inside the PC, at the Nvision conference that opened on Monday.

In his inaugural keynote--this is the first Nvision conference--Nvidia CEO Jen-Hsun Huang reminded the audience that the graphics processing unit (GPU) has come a long way. In short, the GPU has evolved from a simple fixed-function graphics accelerator (e.g., the IBM 8514 that debuted in 1987) to the modern graphics chip, a computing engine capable of almost one teraflop of processing power. (A teraflop is equal to one trillion floating point operations per second.)

Huang, responding to an email query, made it clear that the GPU is complementary to the CPU, or Central Processing Unit. "It is not about replacing the CPU at all," he said. "We don't believe that replacing the CPU is a good strategy. Supplementing the CPU is far better." Intel is the world's largest supplier of CPUs.

In the keynote, Huang cited Stanford University's Folding@home program, a distributed computing project that uses about 2.6 million PCs--for a total of 288 teraflops of computing power--to study protein folding and misfolding. This is expected to deepen researchers' understanding of diseases like Alzheimer's and cancer.

Nvidia has released a version of the Folding@home client built on its CUDA development environment, and more than 24,000 GPUs are now running it. Though that number represents less than 1 percent of the processors in the Folding@home project, those GPUs contribute 1.4 petaflops of performance--nearly five times the processing power of all the CPUs in use by Folding@home. The researchers at Stanford hope that GPUs will significantly shorten the time to discovery of cures for many diseases.
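
Those figures work out to roughly 58 gigaflops per GPU on average, versus on the order of 100 megaflops per conventional PC in the project. CUDA taps that throughput by running the same small function across thousands of lightweight threads, each working on its own slice of the data. The sketch below is purely illustrative--it is not Folding@home's actual source code--but it shows the shape of such a data-parallel kernel, with each GPU thread accumulating the pairwise forces acting on one simulated particle (the particle count, grid layout, and simplified force law are all invented for the example):

// A minimal, illustrative CUDA kernel -- not Folding@home's actual source.
// Each GPU thread owns one particle and accumulates the softened pairwise
// forces exerted on it by every other particle, so thousands of particles
// are processed concurrently.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void pairwise_forces(const float3* pos, float3* force, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    float fx = 0.0f, fy = 0.0f, fz = 0.0f;
    for (int j = 0; j < n; ++j) {
        if (j == i) continue;
        float dx = pos[j].x - pos[i].x;
        float dy = pos[j].y - pos[i].y;
        float dz = pos[j].z - pos[i].z;
        float r2 = dx * dx + dy * dy + dz * dz + 1e-6f;  // softened squared distance
        float inv_r = rsqrtf(r2);
        float s = inv_r * inv_r * inv_r;                 // ~1/r^3 falloff
        fx += dx * s;
        fy += dy * s;
        fz += dz * s;
    }
    force[i].x = fx;
    force[i].y = fy;
    force[i].z = fz;
}

int main()
{
    const int n = 4096;
    float3 *pos, *force;
    cudaMallocManaged(&pos, n * sizeof(float3));
    cudaMallocManaged(&force, n * sizeof(float3));

    // Seed particles on a 16 x 16 x 16 grid.
    for (int i = 0; i < n; ++i) {
        pos[i].x = (float)(i % 16);
        pos[i].y = (float)((i / 16) % 16);
        pos[i].z = (float)(i / 256);
    }

    // One thread per particle, 256 threads per block.
    pairwise_forces<<<(n + 255) / 256, 256>>>(pos, force, n);
    cudaDeviceSynchronize();

    printf("force on particle 0: %f %f %f\n", force[0].x, force[0].y, force[0].z);
    cudaFree(pos);
    cudaFree(force);
    return 0;
}

Compiled with nvcc and run on a CUDA-capable card, the program prints the net force on the first particle; the same pattern--one thread per independent work item--is what lets a GPU keep thousands of arithmetic units busy at once.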

Following this, Peter Stevenson of Realtime Technologies (RTT) gave a demo of real-time ray tracing as it is used in auto design, showing a digital prototype of a new Lamborghini model. Intel has mentioned ray tracing frequently over the last six months as a technique it may adopt in the future; PC graphics technology today uses rasterization to generate images.

Ray tracing can render three-dimensional graphics with extremely complex light interactions, allowing the creation of transparent surfaces and shadows, for example, with stunning photorealistic results.
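
At its core, a ray tracer repeats a small amount of geometry millions of times: fire a ray from the eye through each pixel, find the nearest surface it hits, and shade that point according to the lights in the scene. The CUDA sketch below is a bare-bones illustration of that loop--one thread per pixel, a single hard-coded sphere and light, grayscale output--and is not the renderer RTT or Nvidia showed; production ray tracers layer reflection, refraction, and acceleration structures on top of this same core:

// Minimal ray-tracing sketch, for illustration only: each GPU thread fires
// one primary ray through a pixel, intersects it with a single sphere, and
// shades the hit by how directly the surface faces a light at (1, 1, 1).
#include <cstdio>
#include <cuda_runtime.h>

__global__ void trace(unsigned char* image, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // Camera at the origin looking down -z; sphere of radius 1 at (0, 0, -3).
    float u = 2.0f * x / width - 1.0f;
    float v = 2.0f * y / height - 1.0f;
    float dx = u, dy = v, dz = -1.0f;
    float len = sqrtf(dx * dx + dy * dy + dz * dz);
    dx /= len; dy /= len; dz /= len;

    float cx = 0.0f, cy = 0.0f, cz = -3.0f, radius = 1.0f;

    // Ray-sphere intersection: solve |origin + t*d - center|^2 = r^2 for t.
    float ocx = -cx, ocy = -cy, ocz = -cz;          // origin minus center
    float b = ocx * dx + ocy * dy + ocz * dz;
    float c = ocx * ocx + ocy * ocy + ocz * ocz - radius * radius;
    float disc = b * b - c;

    unsigned char shade = 30;                       // background gray
    if (disc >= 0.0f) {
        float t = -b - sqrtf(disc);                 // nearest hit distance
        if (t > 0.0f) {
            // Surface normal at the hit point, lit from direction (1, 1, 1).
            float nx = (t * dx - cx) / radius;
            float ny = (t * dy - cy) / radius;
            float nz = (t * dz - cz) / radius;
            float l = 0.57735f;                     // normalized light direction
            float diffuse = fmaxf(0.0f, nx * l + ny * l + nz * l);
            shade = (unsigned char)(40 + 215 * diffuse);
        }
    }
    image[y * width + x] = shade;
}

int main()
{
    const int w = 512, h = 512;
    unsigned char* image;
    cudaMallocManaged(&image, w * h);

    dim3 block(16, 16);
    dim3 grid((w + 15) / 16, (h + 15) / 16);
    trace<<<grid, block>>>(image, w, h);
    cudaDeviceSynchronize();

    // Write a grayscale PGM so the result can be inspected.
    FILE* f = fopen("sphere.pgm", "wb");
    if (!f) return 1;
    fprintf(f, "P5\n%d %d\n255\n", w, h);
    fwrite(image, 1, w * h, f);
    fclose(f);
    cudaFree(image);
    return 0;
}

Compiling this with nvcc and running it produces sphere.pgm, a shaded sphere on a gray background; everything a film-quality ray tracer does beyond that is more rays and more bounces per pixel.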

This demonstration was followed by Joshua Edwards of Microsoft Live Labs. He gave a demo of Photosynth, which is based on the research of Noah Snavely and Steve Seitz at the University of Washington and Richard Szeliski of Microsoft Research. Photosynth uses dozens or hundreds of photos of a place to reconstruct a 3D model and then displays a 360-degree perspective of the location.

Edwards showed how a series of photographs can be combined to create an interactive view of Stonehenge and the National Archives Building.

Later, Huang showcased a technology getting a lot of buzz--3D stereoscopic graphics. (At the Intel Developer Forum last week, Intel announced a deal with DreamWorks Animation to enhance 3D cinema and bring 3D to TVs and other devices, which the two companies branded Intru3D.) Huang showed stereoscopic 3D clips from Nvidia's Medusa demo and from Age of Empires.

Next up was Jeff Han of Perceptive Pixel. (Han's touch screen technology has been featured as the "Magic Wall" on CNN's Election Center coverage.) Han demonstrated his company's multitouch user interface technology using a 100-inch multitouch display, giving the audience a taste of what the UI of the future could become. Han said the current bottleneck to multi-user computing is antiquated input devices like the mouse. Han and Huang were able to interact with one display simultaneously, moving things around the screen and calling up objects with simple hand motions.

Initial applications are limited to the military and high-end design, but the technology is expected to trickle down into enterprise and home computing. (Microsoft Surface is an analogous example of this type of user interface.)