
Needs of big firms foretell Intel, Nvidia battle

Large Intel customers state what graphics chip suppliers need to do to make the technology more palatable for high-performance computing.

Brooke Crothers

As Intel prepares to invade Nvidia turf, large companies at the Intel server chip rollout Monday stated--in some cases quite objectively--what graphics chip suppliers need to do to make this technology more palatable for high-performance computing.

Lincoln Wallen, head of research and development at DreamWorks Animation. Screen capture by Brooke Crothers

Besides competing in the gaming graphics market, Intel is eyeing large high-performance computing customers such as DreamWorks Animation (whose "Monsters vs. Aliens" opened last weekend to large box office numbers) for its future Larrabee graphics chip.

Nvidia is already a player in the so-called general-purpose GPU space, which applies graphics processing units (GPUs) to high-performance computing. As described by Nvidia, high-performance computing on the GPU uses a CPU and GPU together in a heterogeneous computing model, with the "sequential" part of the application running on the CPU and the computationally intensive part running on the hundreds of processing cores built into the GPU.

Application developers have to modify their application to take the compute-intensive kernels (the routines that do the heavy numerical work) and map them to the GPU. The rest of the application remains on the CPU.
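To make that division of labor concrete, here is a minimal, hypothetical CUDA sketch (not code from Nvidia, BP, or DreamWorks; the kernel and variable names are illustrative only): the sequential setup stays on the CPU, the data is copied to the accelerator, and a simple compute-intensive kernel is mapped onto the GPU's cores.

```cuda
// Hypothetical sketch of the CPU+GPU split described above. The "kernel"
// here is the compute-intensive routine that runs on the GPU, one thread
// per array element; everything else stays on the CPU.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;   // the computationally intensive work
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Sequential part of the application: runs on the CPU.
    float *host = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    // Moving data between the general-purpose system and the accelerator,
    // the overhead BP's Keith Gray mentions later in this story.
    float *dev = nullptr;
    cudaMalloc(&dev, bytes);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);

    // Map the compute-intensive kernel onto the GPU's processing cores.
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);

    // Copy the results back to the CPU side of the application.
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
    printf("first element after scaling: %f\n", host[0]);

    cudaFree(dev);
    free(host);
    return 0;
}
```

Even this toy example hints at why the customers quoted below worry about programming difficulty and data movement: the application has to be restructured around explicit copies and a separate kernel function just to reach the GPU's cores.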

At the Intel "Nehalem" server chip event on Monday, a panel of representatives from large companies addressed the issue of CPU versus GPU. Currently, these customers are using CPUs to do their data crunching.

Keith Gray, manager of high-performance and technical computing at oil giant BP, spelled out why he has hesitated to use GPUs to date while expressing interest in adopting them in the future. "Our business is about accelerating our development of new seismic imaging research algorithms. At this point we actually believe the level of programming difficulty (and) lack of standardization of application development tools make the move to accelerated computing a bit risky," he said.

CPU (left) versus GPU. Credit: Nvidia

Gray continued: "We are watching the evolution of the programming interfaces. Once those are better standardized, once the issues of moving data back and forth from the general-purpose system to an accelerator are addressed, we'll be very interested in taking advantage of it," he said.

Lincoln Wallen, head of research and development at DreamWorks Animation, is also looking into exploiting the power of the GPU for tasks such as rendering. "We're looking forward to exploit more flexible compute models, perhaps involve more of the graphics processing functionality but tightly coupled with very powerful CPUs, to address the particular way in which we generate images: very soft body, lots of geometry generation," he said.

Wallen added that, as he sees it, Larrabee offers an advantage because of its tight coupling between the CPU and GPU. "The promise of Larrabee with that tight coupling and the programming model offers a great opportunity to start to explore that type of architecture for our particular workloads," he said.