Intel considered buying graphics heavyweights Nvidia, ATI

Last year Intel considered making what would have been one of the largest acquisitions in its history, but Pat Gelsinger said the company eventually decided to forge its own path for graphics-processing technology.

As rival AMD was preparing to snap up graphics chipmaker ATI Technologies, Intel was considering topping AMD's offer or going after Nvidia, according to one of the company's top executives.

In an interview with The Inquirer, Pat Gelsinger, senior vice president and general manager of Intel's digital enterprise group, said Intel looked "pretty closely" at making a play for Nvidia or ATI, the two largest graphics chip companies in the world. Obviously, that never happened, as AMD closed its acquisition of ATI last year and Nvidia continues on as a standalone company.

Intel's Pat Gelsinger addresses attendees of the company's Fall 2007 Intel Developer Forum. (Photo: Stephen Shankland/CNET News.com)

Intel had some unique concerns that checked its ambitions, according to Gelsinger. "One issue was that we didn't know if we could because, if number one buys number two or three, what happens regulatory-wise?" Intel is the leading supplier of graphics technology for PCs because of its integrated graphics chipsets, and if it were to acquire a dominant share of the graphics market to augment its dominant share of the PC processor market, the U.S. government (well, perhaps the next administration) might have sat up and taken notice. And European regulators, currently hounding Intel on that continent, would almost assuredly have objected to the deal.

But graphics processors aren't just about rendering pretty pictures anymore. One of the real reasons graphics technology is attractive to both Intel and AMD is that graphics chips are very good at processing a stream of instructions at high speeds. That's why AMD bought ATI, and it has plans to integrate graphics chips directly onto a PC processor in 2009, a project known as Fusion.

Right now, those chips are designed to handle graphics data, but there's no reason why they couldn't be used for other applications that require high-performance computing, as long as the industry can figure out a way to program for those chips.
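The "stream of instructions" idea can be sketched in plain Python: a GPU-friendly workload applies one small, uniform operation independently across a large array of elements, which is exactly the shape of work a graphics pipeline parallelizes. This is an illustrative sketch only (the function and data are hypothetical, not from the article), written serially here; on a GPU each element would be handled by a separate execution lane.

```python
def saxpy(a, xs, ys):
    """Apply the same operation (a*x + y) to every element pair.

    Each result depends only on its own inputs, so all elements could
    be computed at once -- the data-parallel pattern that makes GPUs
    attractive for general-purpose high-performance computing.
    """
    return [a * x + y for x, y in zip(xs, ys)]

print(saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]))  # [12.0, 24.0, 36.0]
```

The programming-model debate Gelsinger describes is essentially about how developers express this kind of kernel: through graphics-oriented GPU toolchains, or through the familiar x86 model Intel was proposing.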

"The key transition (we're going through now) is in the graphics programming model," Gelsinger told The Inq. "The issue (GPU makers) have is making the pipelines more programmable, and we have the most programmable model on the planet--IA." IA (Intel Architecture) is Intel's term for the x86 instruction set; the company likes to remind everyone whenever possible that it came up with that idea.

Instead of teaching programmers how to exploit graphics chips, Intel's plan is to develop a project called "Larrabee" that will build an x86-compatible chip with the performance of a graphics chip. "Larrabee ends the debate on GPGPUs (general purpose graphics processing units)," Gelsinger said at the Beijing Intel Developer Forum in April. "This is what developers want." Neither Fusion nor Larrabee will turn into a product for a long time, so developers will have plenty of time to decide which model will prevail.

Check out the rest of The Inq's entertaining interview with Gelsinger, as well as the first two parts posted earlier in the week and the final part scheduled for tomorrow.

About the author

    Tom Krazit writes about the ever-expanding world of Google, as the most prominent company on the Internet defends its search juggernaut while expanding into nearly anything it thinks possible. He has previously written about Apple, the traditional PC industry, and chip companies. E-mail Tom.

     
