Gordon Moore on 40 years of his processor law

Once a year, the Intel co-founder speaks to the press. Here are his thoughts on how Moore's Law has lasted four decades.

Gordon Moore is one of the founding fathers of Silicon Valley and one of the few still alive.

His famous dictum turns 40 on April 19. He spoke to reporters recently about the electronics industry's progress, artificial intelligence, the emergence of China and the early days of the industry.

Q: So where did Moore's Law come from?
Moore: For their 35th anniversary issue, the editor of Electronics magazine asked me to write an article on the future of semiconductor components for the next 10 years. I wanted to get across the idea that integrated circuits would be the way to make things cheap. So I made this extrapolation. The biggest circuit available then had something like 30 components on it. I looked historically and saw we'd kind of gone four, eight, sixteen, and we were about doubling every year. I didn't think it was going to be especially accurate; I just was trying to get the idea across that things were going to be significantly more complex and a lot cheaper, and it turned out to be much more accurate than I had any reason to believe.

One of my friends--I believe it was professor Carver Mead from Caltech--called this Moore's Law. The name stuck. I couldn't utter it for about 20 years, 'til finally I got reasonably comfortable with it.

In 1975, I updated Moore's Law, and we've been on that pretty much ever since. We're actually a little ahead of that, we're doubling in less than 24 months these days.
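The extrapolation Moore describes is simple compounding. As an illustrative sketch (the starting count and doubling periods below are just examples drawn from the interview, not Intel figures):

```python
def projected_components(start: float, years: float, doubling_period: float) -> float:
    """Project a component count that doubles every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

# ~30 components in 1965, doubling every year for a decade (the original 1965 pace):
print(round(projected_components(30, 10, 1.0)))  # 30720

# The revised 1975 pace, doubling every two years, over the same decade:
print(round(projected_components(30, 10, 2.0)))  # 960
```

The gap between the two results shows why the 1975 revision mattered: halving the doubling rate changes a ten-year projection by a factor of 32.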

Were you aware of what all this doubling of components would mean and what it would allow people to create?
Moore: I reread the article and noticed I talked about things like home computers. I was surprised to find that myself. I didn't realize that I had predicted them in 1965. You have to put some kind of application in there. I think I also talked about electronic watches--you know, it's like saying we lack only the displays and (such). Unfortunately, Intel tried that business once.

Are things going to slow down, or will new materials allow the industry to break through the barrier?
Moore: Well, in the first place, I've never been able to see more than two or three generations ahead without seeing something that looked like a fairly impenetrable barrier.

With any materials made of atoms there is a fundamental limit where you can't go any smaller, and before that there'll be some kind of a limitation. To me, that'll really change the slope again. I changed it once from doubling every year to doubling every two years, and maybe we'll slow down to doubling every three to four years. After that, they'll make bigger chips. So there is a way out. At that time, we'll be putting several billion transistors on the integrated circuit.

Is there anything coming down the pike that could replace silicon?
Moore: Some of these other things, quantum dots and nanotechnology and that kind of thing--I will admit to being a skeptic around those things replacing mainstream digital silicon. You can clearly make a tiny little transistor by these techniques with potentially great high frequency, but can you connect a billion of them together? That's really the problem; it's not making a small transistor.

I view the technology that has developed around integrated circuits to be a fundamental way of building complex microstructures. Rather than being replaced, it's actually infiltrating a lot of other fields. You have MEMS and gene chips. Some of these microfluidic devices are little chemistry laboratories on a chip. (Silicon) is a very powerful technology that's going to be broadly used, and I don't see anything coming along like this and getting a reasonable chance to replace it.

That doesn't mean that a lot of the things being done won't be incorporated. I could imagine incorporating carbon nanotubes into the various metal layers, something like that, but I don't feel this is an alternative (to silicon transistors). In digital electronics, we've got a cumulative couple of hundred billion (dollars) invested in R&D.

How many times did people predict the end of Moore's Law, and how many times were you actually concerned it was going to happen?
Moore: It seems to me in the last 10 years I read a lot of articles that did. There was a time when I believed one micron was probably going to be the limit. We went through that so fast it wasn't a barrier at all. Then I thought a quarter of a micron might be, but it didn't stop it. Now we're below a tenth of a micron. Heck, we're doing 65 nanometers, and I don't see it stopping, short term anyhow.

What are people going to need this expanded computing power for?
Moore: In some significant scientific problems, people are really limited by the performance of the computing systems we have now. I was talking with Stanford professor Barbara Block, who sticks tags in marine animals. They collect all the data, and she is just getting buried in the information coming back. It's true for a lot of scientific areas though: They're producing data much faster than they can assimilate it.

A couple of months from now, Paul Otellini will take over as chief executive at Intel. What advice do you have for him or have you given him?
Moore: Well, he hasn't asked for it (laughs). I think the recent reorganization of Intel into a platform focus is to a significant extent Paul's view of how he wants to work in the future. I think this is an important direction to go, as the various markets require different things. This assures that they get the proper attention. I think it was a very appropriate change.

Paul is different in that he is the first CEO of Intel who's not a Ph.D. manager-engineer. But he is more technical than I am at this stage of the game here. He ran the microcomputer division for a while and has had a lot of contact with customers.

When you look back, what products have inspired you to say, "Wow, that's a beautiful piece of work?"
Moore: Well, the ones I think of as landmarks were not necessarily beautiful pieces of work, but they turned out to be economically viable. The first dynamic RAM we made at Intel is in that category--the old 1103. It was a 1K DRAM and that was our first really big-revenue product. I guess I have to put the first microprocessor in that category too. It was very slow, but it did the job that it was designed to do. There've been a lot of things since that have been very important economically. I tend to think of them as more evolutionary products.

Will the additional computing power you get from following Moore's Law ever get us to computers with the equivalent of human intelligence?
Moore: Human intelligence in my view is something done in a dramatically different way than Von Neumann computers, and I don't think the route we're pursuing now is going to get to something that looks like human intelligence.

I do think, though, that eventually we will change our approach and do things much closer to the way they're done biologically and have a very big chance to get into something that looks for all intents and purposes like human intelligence. But I really don't think it's a simple approach. The amount of power that we would need to do everything the human brain does is probably more than we generate on Earth by our current approach.

Although chips are getting cheaper, chip-fabrication facilities are getting more expensive. How big of a problem will that be for the industry?
Moore: In the '60s, fabs were relatively cheap, so that wasn't much of a problem. Cost really became significant in the '80s--before that the labor and the engineering were the dominant factors. With fabs, now you look at $2.5 billion to $3 billion. But the converse is you get so much more product out. When Intel started in 1968, we took a big leap to 2-inch wafers. Now we have, what, 12-inch wafers basically, six times the diameter, 36 times the area--so on one wafer you get a heck of a lot more stuff, and you get a much higher yield, too.
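The wafer arithmetic works out as Moore says: area scales with the square of the diameter, so a six-fold jump in diameter gives a 36-fold jump in area. A quick check:

```python
import math

def wafer_area(diameter: float) -> float:
    """Area of a circular wafer for a given diameter (any consistent unit)."""
    return math.pi * (diameter / 2) ** 2

# 2-inch wafers (Intel, 1968) vs. 12-inch wafers: 6x the diameter, 36x the area.
print(wafer_area(12) / wafer_area(2))  # 36.0
```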

Besides coining Moore's Law, you're also sort of one of the founders of job hopping, going from Shockley to Fairchild to Intel. That must have been novel at the time.
Moore: Well, Shockley was the unique experience. He was a brilliant physicist, but he had very peculiar ideas of how people work, and he had some destructive practices. I had a reasonably good relationship with Shockley because I was a chemist, and he didn't think he had to know everything I do.

We got so disturbed that we went around Shockley to Arnold Beckman, who was providing the financial support, and asked him if we could somehow or other have Shockley removed from the management chain. We'd love to have him as a consultant or something, but bring somebody else in to manage the company.

We thought we were making progress, but finally somebody got to Beckman and said that could ruin Shockley's career, so he changed his mind.

We felt we'd burnt our fingers so badly then that we had to go look for other jobs. By accident, one of the group, Eugene Kleiner, wrote to a friend of his father's in the investment banking business, and they sent out a couple of partners. Well, one partner was a young Harvard MBA by the name of Arthur Rock. (Rock would later fund Intel and Apple Computer, among other companies.)

We talked to Rock and he said, "Well, you know what? We found a company for you (all) to work for." So we got this opportunity to set up a new company (Fairchild Semiconductor, within the larger Fairchild Corp.), and that was the formation of Fairchild. It was a group of eight of us.

But we learned that a group of young kids has a tough time pushing ideas inside a big company. Fairchild went through two chief executives within a six-month period. Bob Noyce was the logical internal candidate, and he was being passed over, so he got kind of ticked off about that and decided he wanted to leave. I saw that my job would probably change significantly, so I said, "OK, I'll leave to set up Intel."

Fairchild had a lot of spinouts. Fairchild developed technology faster than it could exploit it. That started what I call the Silicon Valley effect, where any engineer with a new idea would obviously try to form a new company.

Paul Otellini is the incoming CEO, but Andy Grove still casts a huge shadow. Could you assess his career?
Moore: Andy says I am the only boss he ever had. If that's the case, I don't think he ever really had a boss (laughs). I hired him out of graduate school at Fairchild, and he rapidly worked his way up there and became assistant director of the laboratory. When I told him I was leaving Fairchild, he said, "I'm coming too."

Andy is a unique individual. He has changed careers several times along the way. When he came to Intel, we thought his role would probably be something like director of research. Then he got very interested in how organizations work, moved beyond his Ph.D. training and even wrote books on management from a technical point of view. When I stepped aside as CEO, he took over the job and he evolved into the role of industry spokesman. When he stepped down as CEO and became chairman, he started working with governments and had to just abandon what he was doing previously, and has done an excellent job.

Do you personally use computers?
Moore: It seems like I still spend about half my time in front of one. E-mail is an important part of my life. I live half the year in Hawaii these days. If it wasn't for e-mail, I don't know how I would stay connected. There are spreadsheets; there's word processing work--those are the functions I probably use the most. I play simple games; I handle photographs.

I get very frustrated with the software sometimes. I hate how long it takes my computer to reboot when I have to shut it down for any reason.

How optimistic are you about the future of the U.S. technology industry?
Moore: Silicon Valley is still a great place to start a company in any of the information technology or biotechnology areas. Everybody is available locally. In Silicon Valley the biggest disadvantage is that it's expensive, especially the houses. It's hard to move young people in. Even with the recessions we've had, the price of housing has continued to increase. I don't understand how it can continue to go up.

The area's uniqueness is not as great as it was in the beginning. Now there's a lot of other places competing technologically. As far as my view of U.S. competitiveness, I think it's a real challenge. Our educational system is not what it ought to be, particularly when you are talking about K through 12. The universities are still great.

We (face) very formidable competition in the world--I think the impact of China in particular is just beginning to be felt; 1.1 or 1.2 billion people are going to have a dramatic impact. In the next 20 years, we're going to see how that plays out. I'm expecting the U.S. will still be a successful player, but I don't think it'll enjoy the position it had in, say, the past 20 years. We'll have to work hard at it. China is training 10 times as many engineers. Increasingly, they are a big consumer of the world's resources. Their technology is catching up very rapidly. It's a very entrepreneurial society.
