His famous dictum turns 40 on April 19. He spoke to reporters recently about the electronics industry's progress, artificial intelligence, the emergence of China and the early days of the industry.
Q: So where did Moore's Law come from?
Moore: For their 35th anniversary issue, the editor of Electronics magazine asked me to write an article on the future of semiconductor components for the next 10 years. I wanted to get across the idea that integrated circuits would be the way to make things cheap. So I made this extrapolation. The biggest circuit available then had something like 30 components on it. I looked historically and saw we'd kind of gone four, eight, sixteen, and we were about doubling every year. I didn't think it was going to be especially accurate; I just was trying to get the idea across that things were going to be significantly more complex and a lot cheaper, and it turned out to be much more accurate than I had any reason to believe.
One of my friends, Caltech professor Carver Mead, I believe, called this Moore's Law. The name stuck. I couldn't utter it for about 20 years, 'til finally I got reasonably comfortable with it.
In 1975, I updated it, and we've been on that pretty much ever since. We're actually a little ahead of that; we're doubling in less than 24 months these days.
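The extrapolation Moore describes is simple compounding; a minimal sketch, with the 1965 starting point of roughly 30 components and the doubling cadences he mentions taken as illustrative assumptions:

```python
def components(year, start_year=1965, start_count=30, months_per_doubling=12):
    """Project component count by doubling every `months_per_doubling` months."""
    doublings = (year - start_year) * 12 / months_per_doubling
    return start_count * 2 ** doublings

# Doubling every year: 1965 -> 1975 is ten doublings, a 1024x increase.
print(components(1975))                          # 30 * 2**10 = 30720.0
# Under the 1975 revision to doubling every two years, only five doublings:
print(components(1975, months_per_doubling=24))  # 30 * 2**5 = 960.0
```

The point of the original article survives the exact cadence chosen: any fixed doubling period compounds into orders-of-magnitude growth within a decade.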
Were you aware of what all this doubling of components would mean and what it would allow people to create?
Moore: I reread the article and noticed I talked about things like home computers. I was surprised to find that myself. I didn't realize that I had predicted them in 1965. You have to put some kind of application in there. I think I also talked about electronic watches--you know, it's like saying we lack only the displays and (such). Unfortunately, Intel tried that business once.
Are things going to slow down, or will new materials allow the industry to break through the barrier?
Moore: Well, in the first place, I've never been able to see more than two or three generations ahead without seeing something that looked like a fairly impenetrable barrier.
With any materials made of atoms, there is a point where you can't go any smaller, and before that there'll be some kind of a limitation. To me, that'll really change the slope again. I changed it once from doubling every year to doubling every two years, and maybe we'll slow down to doubling every three to four years. After that, they'll really make bigger chips. So there is a way out. At that time, we'll be putting several billion transistors on the integrated circuit.
Is there anything coming down the pike that could replace silicon?
Moore: Some of these other things, quantum dots and nanotechnology and that kind of thing--I will admit to being a skeptic around those things replacing mainstream digital silicon. You can clearly make a tiny little transistor by these techniques with potentially great high frequency, but can you connect a billion of them together? That's really the problem; it's not making a small transistor.
I view the technology that has developed around integrated circuits to be a fundamental way of building complex microstructures. Rather than being replaced, it's actually infiltrating a lot of other fields. You have things like gene chips. Some of these devices are little chemistry laboratories on a chip. (Silicon) is a very powerful technology that's going to be broadly used, and I don't see anything coming along like this and getting a reasonable chance to replace it.
That doesn't mean that a lot of the things being done won't be incorporated. I could imagine incorporating carbon into the various metal layers, something like that, but I don't feel this is an alternative (to silicon transistors). In digital electronics, we've got a cumulative couple of hundred billion (dollars) invested in R&D.
How many times did people predict the end of Moore's Law, and how many times were you actually concerned it was going to happen?
Moore: It seems to me in the last 10 years I read a lot of articles that did. There was a time when I believed one micron was probably going to be the limit. We went through that so fast it wasn't a barrier at all. Then I thought a quarter of a micron might be, but it didn't stop it. Now we're below a tenth of a micron. Heck, we're doing 65 nanometers, and I don't see it stopping, short term anyhow.
What are people going to need this expanded computing power for?
Moore: In some significant scientific problems, people are really limited by the performance of the computing systems we have now. I was talking with Stanford professor Barbara Block, who sticks tags in marine animals. They collect all the data, and she is just getting buried in the information coming back. It's true for a lot of scientific areas: They're producing data much faster than they can assimilate it.
A couple of months from now, Paul Otellini will take over as chief executive at Intel. What advice do you have for him or have you given him?
Moore: Well, he hasn't asked for it (laughs). I think the recent reorganization of Intel is to a significant extent a view of how he wants to work in the future. I think this is an important direction to go, as the various markets require different things. This assures that they get the proper attention. I think it was a very appropriate change.
Paul is different in that he is the first CEO of Intel who's not a Ph.D. manager-engineer. But he is more technical than I am at this stage of the game. He ran the microcomputer division for a while and has had a lot of contact with customers.
When you look back, what products have inspired you to say, "Wow, that's a beautiful piece of work?"
Moore: Well, the ones I think of as landmarks were not necessarily beautiful pieces of work, but they turned out to be economically viable. The first dynamic RAM we made at Intel is in that category--the old 1103. It was a 1K DRAM, and that was our first really big-revenue product. I guess I have to put the first microprocessor in that category too. It was very slow, but it did the job that it was designed to do. There've been a lot of things since that have been very important economically. I tend to think of them as more evolutionary products.
Will the additional computing power you get from following Moore's Law ever get us to computers with the equivalent of human intelligence?
Moore: (Intelligence) in my view is something done in a dramatically different way than in Von Neumann computers, and I