During a February earnings conference call, Jen-Hsun Huang, president and CEO of Nvidia, repeated one thing over and over: graphics are in and the central processor is out. There is some truth to this. And Intel's plans for future silicon technology address it head-on.
Pat Gelsinger, general manager of the digital enterprise group at Intel, spelled out Intel's strategies for future graphics technology on Monday. He addressed the higher-octane technology that will be built into future "Nehalem" processors and the highly sophisticated "Larrabee" chips that will be offered as "discrete" or standalone products.
First, some perspective. Intel--not Nvidia or ATI--is the world's largest supplier of graphics chips for PCs. The reason is simple. Intel's integrated graphics silicon ships in tens of millions of PCs every year. It's a low-cost--and relatively low-performance--solution that many PC vendors opt for. But that doesn't mean Intel is the premier supplier of sophisticated mainstream PC graphics technology. That distinction goes to Nvidia and ATI. In that market, Intel is a non-player. This is evidenced by the proliferation of Nvidia- and ATI-based graphics board reviews at enthusiast Web sites and the bigger role that graphics processors from these two companies play in handling increasingly complex visual applications.
And, as the Nvidia CEO has intimated, unless Intel responds aggressively, this could make Nvidia a direct Intel competitor in the future. Nvidia's newest GeForce 9600 GT GPU rivals Intel's chips in complexity, at the very least. It has 64 stream processors--each individually clocked at 1625MHz--and a 256-bit memory interface running at 900MHz, and it contains more than 500 million transistors.
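For scale, the quoted memory figures imply peak bandwidth in the tens of gigabytes per second. Here is a minimal back-of-the-envelope sketch; the doubling factor assumes double-data-rate (GDDR3) memory, which the article does not state:

```python
# Rough peak-bandwidth estimate from the GeForce 9600 GT figures quoted above.
bus_width_bits = 256      # quoted memory interface width
memory_clock_mhz = 900    # quoted memory clock
ddr_factor = 2            # assumption: GDDR3 transfers on both clock edges

# bytes per transfer * transfers per second, expressed in GB/s
bandwidth_gb_s = (bus_width_bits / 8) * memory_clock_mhz * ddr_factor / 1000
print(f"Peak memory bandwidth: {bandwidth_gb_s:.1f} GB/s")  # 57.6 GB/s
```

That kind of sustained bandwidth is part of what Gelsinger concedes below is hard to match with graphics integrated next to a CPU.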
To address this, Intel intends to boost integrated graphics performance in Nehalem processors and, for the first time, offer a discrete (standalone) graphics product for high-end markets. Both Nehalem and Larrabee are targeted at the 2009-2010 time frame.
So, how will Intel improve Nehalem integrated graphics? Not surprisingly, more transistors and more bandwidth, according to Gelsinger. "Largely, integrated graphics is as much die area as you can throw at it and as much memory bandwidth as you can give it," Gelsinger said. "So, could we equal discrete graphics performance with integrated graphics? Of course." Gelsinger went on to say that Intel will focus on "more transistor budget, leading-edge process technology, and more memory bandwidth dedicated to integrated graphics."
Logistically, this will be accomplished by turning today's three-chip platform into a two-chip platform, he said. That means moving the graphics silicon onto the same die as the main processor. More specifically, the part of the chipset referred to as the "north bridge"--which contains the memory controller and graphics controller--is going away. Both of those components will be moved onto the CPU die. The other part of the chipset, the "south bridge," which includes I/O-related components, will remain separate.
But Gelsinger said there are definite limits to what can be done with integrated graphics because of the big power and transistor requirements for high-end discrete (standalone) graphics products. They have "a very different price point and die envelope and power envelope. Some of the (discrete) graphics chips alone are 150 watts. We build whole platforms for less (power) than that," he said.
This is where Larrabee comes in. Gelsinger said that Larrabee--a "many-core" chip--will target Nvidia's and AMD/ATI's discrete graphics. "Obviously, if we're going to be competing in the discrete graphics marketplace, we think we're going to have to compete well...in terms of traditional benchmarks like 3DMark," he said, adding that Intel will support traditional graphics interfaces such as DirectX and OpenGL. A big potential plus: since Larrabee cores will be based on the Intel Architecture, developers who already write code for standard Intel microprocessors can develop for Larrabee without learning a completely new architecture.