PALO ALTO, Calif.--AMD's pretty sure we all want better graphics on our PCs, and knowing us, they're pretty sure we don't want to cough up a lot of money to get it.
Phil Hester, AMD's chief technology officer, stopped by the Hot Chips conference here at Stanford University on Tuesday to talk a little more about Fusion, AMD's plan to integrate a graphics processor and PC processor onto the same chip. By the time the chip arrives, Hester thinks, the explosion of video and 3D graphics on PCs will require an affordable chip that still delivers great graphics performance.
"It's not about the silicon, it's about the applications," Hester said. He stepped back in chip history to liken the Fusion project to Intel's decision to integrate a floating-point processor into the 486 chip. There's always a cost trade-off when you integrate something new into a processor that was once done separately. But when enough applications need the extra performance, it's easier to justify adding some cost to dramatically improve performance.
That's the plan for Fusion. It's not going to replace the high-end discrete graphics chips coveted by gamers, and it's not going to deliver the ultimate in CPU performance, Hester said. But AMD thinks that integrating the GPU will be essential around the end of the decade, both because so many applications--games and videos, for starters--will want to latch onto the GPU architecture and because the relative performance of a GPU is way beyond that of a CPU right now, he said.
GPUs and CPUs have traditionally been designed with different priorities in mind. GPUs are designed to sling code in and out as quickly as possible and are good at working with parallelized code, the kind of software increasingly written for multicore processors. Traditional CPUs tend to focus more on code quality and solving problems in sequential order. There's a distinct advantage to having both types of processors on a single chip, so long as AMD can ensure that developers can write code for Fusion and the company clears the integration hurdles, Hester said.
Lots of decisions need to be made. Should PCs with chips like Fusion use DDR memory or graphics DDR memory? AMD's integrated memory controller architecture will require a decision one way or another. How will AMD open up the GPU so more applications can be written to take advantage of its parallel-friendly architecture? AMD hasn't worked out all the details just yet, but open-source elements could help bridge the gap between applications and the hardware.
It doesn't make sense to integrate everything onto a chip, but sometimes it's worth it to take the plunge, Hester said. AMD is betting that it can improve overall graphics performance and still give developers a way to take advantage of the GPU architecture for other tasks, while holding down the fort with traditional application performance. It has two years to work out the details.