A payoff on AMD's 64-bit bet

A technology gamble appears to be paying off for Advanced Micro Devices CTO Fred Weber. Will it be enough to once and for all rid AMD of its underdog tag?

As the chief technology officer for Advanced Micro Devices, Fred Weber has been responsible for the design of the processors millions of PC enthusiasts favor, including the Athlon XP.

More recently, Weber has helped steward a strategy to give AMD's server business an edge against larger rival Intel. The gist of the effort is to extend PC and server processors, sometimes called x86 chips, to 64 bits, and thus boost the performance of relatively inexpensive servers.

The idea appears to be paying off, helped in no small degree by AMD's decision to let its 64-bit Opteron server chip also run 32-bit software. The company, traditionally thought of as the underdog in the server chip business, has so far inked deals with IBM, Hewlett-Packard and Sun Microsystems, which have all adopted the Opteron chip for their product lines. Weber recently spoke with CNET about the strategy and future of 32- and 64-bit chip design.

Did you have a hard time convincing (former AMD chief executive) Jerry Sanders that the jump to 64 bits would be necessary?
Not really. You know, Jerry has always been a fierce competitor and a visionary, and he saw the value of this right away. He had seen that Intel was on the wrong path. As soon as it became clear to him that something like this was the right path, he immediately became very passionate about how important it was.

So you're saying compatibility, or the ability to run both 32-bit and 64-bit software, is the right way to go?
Exactly. AMD64 architecture allows you to run your existing code with ever-better performance, while giving you a path to new code without compromising performance and capabilities. That is the path that does not create unnecessary disruption.

It took quite a while, though, to get it out into the market.
Well, you know, it is much like any of these developments. You are juggling a lot of balls all at once. We had a number of software efforts that we had to get done, and all of this was happening in parallel. It wasn't just inventing the new instruction set architecture, AMD64, but we also had to invent HyperTransport and develop onboard memory controllers.

At the same time, we were still fighting through the DDR versus Rambus nightmare. That was yet another fight in which we had to take a different path from Intel and fight uphill for what turned out to be the right final direction.

All of those balls were in the air simultaneously. We had to create simulators to allow Microsoft and the Linux community to port their operating systems before we had hardware. Somewhere in late 1999, I pulled together a team of about 20 key people and took them all out to dinner, and we declared that we would work together to make this thing happen.

What do you say to corporations that might still be a little nervous about adopting AMD chips? The Opteron is still a fairly new chip by server standards.
People know that, frankly, we have quite a good reputation for quality and reliability, one that has been built up over the years. Many of the chief information officers of the world use AMD Athlon processors at home. There is a bedrock of support for us out there.

A year ago, when we launched this stuff, the question was whether we were going to have any first-tier customers in the server space at all. As of today, we have three of the four main server suppliers using our processors in enterprise-class machines. I think it comes as a surprise to some that we have done that well.

So far, at least some of the Opteron servers that have been sold have gone into things such as clusters. Do you worry about being pigeonholed?
No. You have to ask each of the vendors what their reasoning is. If you look at the press that Sun has made, it's quite clear that this is very much at the heart of their strategy and meant to be used in many classes of computational problems. Different companies have their own different strategies of where they're taking it...but no, I do not think we have been "niched" at all. We're being rolled out over time--that's what's really happening.

Companies like HP or IBM have 32-bit servers based on Intel chips and very high-end 64-bit machines based on their own chips--although HP is moving to Itanium over time. AMD is in the middle as sort of an either/or.
We are very happy with where we are positioned right now. I think you are right that essentially, Intel is currently positioned at the low end of the servers, the proprietary stuff is at the highest end, and there's us in the middle. But of course, we come from a heritage in which we can certainly cover--both from price points and history--the low end as well. We have got a beautiful story for taking the x86 processor architecture as far as it can go over time into all the aspects of the enterprise.

And what about the strategy to extend the x86 architecture that AMD talked about last summer?
I probably should have said AMD64 rather than x86. As devices get more and more software-rich, the value of the x86 instructions in them gets higher and higher. We are going to be making x86 processors to suit more devices and move out of traditional computer devices and data-processing devices into appropriate aspects of consumer electronics and other areas where the software load is getting fancier and fancier.

Like the Geode processor?
The Geode is a great example; we've got a processor with very low cost and very low power but still tremendously nice performance and a very good road map. That made it possible to hit price points and form factors that had not been open to the x86 in the past.

Do those devices need 64 bits as well? Or do you draw the line somewhere?
Some devices do. To the extent that you're into devices that are very heavy on video processing--which certainly many consumer electronic devices are--64 bits is a big advantage. Where you're more into communications, it is premature to put 64 bits in at this time. By the time you get out a decade or so, certainly 64 bits will be appropriate even in what we think of as the smallest devices today, just as we've moved from 8 to 16 to 32 bits in microcontrollers.

What about the folks who say there is just not enough software available yet to bother with 64 bits?
Some people might ask, "Do you want 64-bit software today? Do you absolutely need it?" The answer is, "Unless you're a power user, you don't absolutely need it today." But does anyone question that in three or four years, you'll absolutely want it? I think the answer is no.

That's enough incentive to move to Athlon 64?
What's the highest-performing 32-bit computer? It's the Opteron and the Athlon 64. It has the advantage of being more future-proof than a pure 32-bit (computer). There is nothing wrong with also making a 32-bit decision for some of the computers you buy. That is why we are still selling the Athlon XP as well.

Doesn't that limit AMD's ability to grow, if people are going to keep their computers longer?
Any indication that people are keeping computers longer seems to be mostly countered by the fact that they are also deploying more and more computers in their homes. If you look at the projections for PC shipments over the next couple years, the growth rates are a little down from the peak, but it's still healthy growth. It's not that you won't buy new computers, but you will (also) keep the old ones longer. So longevity is a very important consideration.

Aren't you also working on lower-power notebook chips and chips for portable electronics that will also offer AMD64 technology?
Well, we have a wide range of different devices that are both in the market and coming to market. At the really low end of the market, the sub-5-watt sector, we have the Geode line of processors, which are appropriate for a lot of the ultraportables and set-top boxes. There are no immediate plans to take those to 64 bits. Those tend to be limited-memory devices and do not immediately need 64 bits. But over time, that will come down there as well.

Over what, 10 years?
Less time than that.

When it comes to Opteron, what do you expect to see from that chip? Will four- and eight-processor systems help AMD make a splash in the market?
I think that we will remake the four-way space by bringing new levels of performance and affordability. We will do some pretty interesting stuff with our eight-processor system as well. You will start to hear some interesting things in less than a year about larger systems as well.

We have talked a lot about 64 bits, but I'm sure there are other areas you're looking at. For example, companies are facing difficulties with power consumption and current leakage, as they scale to smaller manufacturing processes.
I think anybody who's at the lead of technology has to deal with very hard problems to keep the road map moving forward. When you shrink devices to these levels, you're talking about as few as 10 or 15 atoms of material in a single layer. It is very, very hard to build these devices. Of course, at the same time, you've got a couple hundred million of these devices, and every single one of them has to be right for the machine to work properly.

We're running up into a lot of the limits of material science. There are many areas where we are pushing the boundaries of material science and silicon technology. I think we, Intel and IBM all face tremendous challenges to move this forward.

Is Moore's Law going to run out? Intel has said, basically, that it's got about 10 years left in it.
The way I would describe it is that we see our way clearly to about 10 more years. I don't know if anybody has ever predicted that it has more than 10 years to go, and I don't know if anyone has ever predicted it has less. Tune in next year.

Is there a Weber's Law?
Well, I'd say Weber's Law is, "Don't forget everybody's law." Moore's Law is very valuable, and Amdahl's Law should not be forgotten when people talk about the value of parallelism. As technology moves forward, substitute uses of that technology become real. You really have to look at all of these laws to understand where products are going and not be driven by just one law.

That gets interesting, when you have multicore processors. Is that where things are headed?
There is a lot of interest in multiprocessing, and it is well founded. Whether you'll see lots of little tiny processors or fewer big processors--or maybe a mix of both--frankly, time will tell.