
Intel details future graphics chip at GDC

Engineers are ready to spell out the inner workings and target markets for Larrabee, Intel's first graphics chip in over a decade.

Brooke Crothers, former CNET contributor

On Friday, Intel engineers are detailing the inner workings of the company's first graphics chip in over a decade at the Game Developers Conference in San Francisco--sending a signal to the game industry that the world's largest chipmaker intends to be a player.

During a conference call that served as a preview to the GDC sessions, Tom Forsyth, a software and hardware architect at Intel working on the Larrabee graphics chip project, discussed the design of Larrabee, a chip aimed squarely at Nvidia and at Advanced Micro Devices' ATI unit.


And Nvidia and AMD will no doubt be watching the progress intently. Intel's extensive and deep relationships with computer makers could give it an inside track with customers and upset the graphics duopoly now enjoyed by Nvidia and AMD. In the last decade, Intel has not competed in the standalone, or "discrete," graphics chip market, where Nvidia and AMD dominate. Rather, it has been a supplier of integrated graphics, a low-performance technology built into its chipsets that offers only a minimal gaming experience. (In the 1990s, Intel introduced the i740 GPU, which, in relative terms, was not a success.)

Forsyth said that there is not yet a Larrabee chip to work with--it's expected late this year or early next year--and that "a lot of key developers are still being consulted on the design of Larrabee." But Intel will offer ways for developers to test the processor, he said. "On the Intel Web site there will be a C++ prototype library. It doesn't have the speed of Larrabee but has the same functionality. Developers can get a feel for the language, get a feel for the power of the machine."
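As a rough idea of what exercising such a prototype library might look like, here is a minimal C++ sketch. The Vec16f type and function names below are hypothetical stand-ins for a 16-wide vector register, not Intel's actual API.

// Hypothetical sketch only -- not Intel's prototype library.
// A C++ class standing in for one 16-wide vector register.
#include <array>
#include <cstddef>
#include <iostream>

struct Vec16f {
    std::array<float, 16> lanes{};

    static Vec16f broadcast(float x) {
        Vec16f v;
        v.lanes.fill(x);
        return v;
    }
};

// One "instruction": a multiply-add across all 16 lanes at once.
Vec16f fused_multiply_add(const Vec16f& a, const Vec16f& b, const Vec16f& c) {
    Vec16f r;
    for (std::size_t i = 0; i < 16; ++i)
        r.lanes[i] = a.lanes[i] * b.lanes[i] + c.lanes[i];
    return r;
}

int main() {
    Vec16f a = Vec16f::broadcast(2.0f);
    Vec16f b = Vec16f::broadcast(3.0f);
    Vec16f c = Vec16f::broadcast(1.0f);
    Vec16f r = fused_multiply_add(a, b, c);
    std::cout << r.lanes[0] << '\n';  // prints 7 (2 * 3 + 1) in every lane
    return 0;
}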

Beyond games, Intel is also trying to catch a building wave of applications that run on the many-core architectures inherent to graphics chips. Nvidia and AMD graphics chips pack hundreds of processing cores that can be tapped not only for accelerating sophisticated games like Crysis but also for scientific research and high-performance computing tasks.

One of the largest test sites for Larrabee is DreamWorks, which will use Larrabee for rendering and animation. Until now, DreamWorks has had to wait overnight for a rendering project to complete. "Using (the) Nehalem (processor), DreamWorks can almost do it in real time and it is only going to get better with Larrabee," said Nick Knupffer, an Intel spokesperson.

Larrabee is "Intel's first many-core architecture," Forsyth said. "The first product will be very much like a GPU. It will look like a GPU. You will plug it into a machine and it will display graphics," he said. (GPU stands for graphics processing unit.)

"But at its heart are processor cores, not GPU cores. So it's bringing that x86 programmable goodness to developers," Forsyth said. Larrabee will carry the DNA of Intel's x86 architecture, the most widely used PC chip design in the world.

Intel is touting the performance of Larrabee's vector unit.

"It's based on a lot of small, efficient in-order cores. And we put a whole bunch of them on one bit of silicon. We join them together with very high bandwidth communication so they can talk to each other very fast and they can talk to off-chip memory very fast and they can talk to other various units on the chip very fast." In-order processing cores are used, for example, in the original Pentium design and in Intel's Atom processor.

"It's the same programming model they know from multicore systems already but there's a lot more of them," he said.

The centerpiece of the chip's core is the vector unit, used to process many operations simultaneously. "The interesting part of the programming model is the SIMD (single instruction, multiple data) vector unit and the instructions that go with it," Forsyth said. "We want to show off this big new vector unit and the instruction set."

Forsyth described what the vector unit can do and how it works with the scalar unit. "(The vector unit) can do 16 floating point operations every single clock. That's a lot of horsepower. Even in just one of these cores--and we have a lot of these cores. So it's a very high-throughput unit. The good thing is that it's independent of the scalar unit. You can issue instructions on the scalar unit and vector unit at the same time. The scalar unit is extremely useful for calculating addresses, doing flow control, doing housekeeping--and keeps all those miscellaneous tasks off the real powerhouse, which is the vector unit."
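In loop terms, the division of labor Forsyth describes looks roughly like the sketch below: the counter, address math, and branch are scalar-unit work, while the 16-wide arithmetic in the body is what the vector unit would execute as a single instruction. This is a conceptual illustration, not Larrabee code.

// Conceptual split of scalar vs. vector work -- not Larrabee code.
#include <cstddef>

void scale_and_offset(float* data, std::size_t n, float scale, float offset) {
    for (std::size_t i = 0; i + 16 <= n; i += 16) {   // scalar unit: counter, compare, branch
        float* block = data + i;                       // scalar unit: address calculation
        for (int lane = 0; lane < 16; ++lane)          // stands in for one 16-wide multiply-add
            block[lane] = block[lane] * scale + offset;
    }
    // (tail elements when n is not a multiple of 16 are omitted for brevity)
}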

At GDC, Intel is encouraging developers to experiment. "They're going to have questions about how do I find 16 things to do at once. But a lot of it is just getting in there and playing with the thing," according to Forsyth. The GDC sessions will be a tour around Larrabee's instructions--"how to actually use these new instructions," he said.
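"Finding 16 things to do at once" often just means batching work that is already independent. A hedged before-and-after sketch, using pixels as the example data:

// Before: one pixel per iteration -- only scalar work.
#include <cstddef>

void brighten_scalar(float* pixels, std::size_t n, float gain) {
    for (std::size_t i = 0; i < n; ++i)
        pixels[i] *= gain;
}

// After: 16 pixels per iteration, each pass mapping to one 16-wide vector multiply.
void brighten_batched(float* pixels, std::size_t n, float gain) {
    std::size_t i = 0;
    for (; i + 16 <= n; i += 16)
        for (int lane = 0; lane < 16; ++lane)   // stands in for a single 16-wide multiply
            pixels[i + lane] *= gain;
    for (; i < n; ++i)                          // scalar tail for the leftovers
        pixels[i] *= gain;
}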

And what about markets beyond gaming? "A funny thing happened on the way to the architecture. We designed this architecture to be 100 percent graphics focused. Whatever we needed to do to get graphics good, we did. And then a year ago, we looked at what we had and said how much of this stuff is actually specific to graphics. It turns out, very little. Graphics workloads are increasingly similar to GPGPU (general-purpose graphics processing unit), increasingly similar to high-powered (high-performance) computing. So, we actually have very little that is specific to graphics. Most of the instruction set is very general-purpose."