
Intel's employee No. 12

Ted Hoff was part of a team that created the Intel 4004 microprocessor, paving the way for the personal computer. How does it feel to be a founding father of a multibillion-dollar industry?

John G. Spooner, Staff Writer, CNET News.com
John Spooner covers the PC market, chips and automotive technology.
When Busicom contracted with Intel in April 1969 to create a series of custom chips for five upcoming calculators, neither company realized it was creating the template for what would ultimately become the personal computer.

Neither did Marcian "Ted" Hoff, a member of what became a three-man team that guided the launch of the Intel 4004 microprocessor. Hoff, Stan Mazor and Federico Faggin are credited with designing the chip.

What made the 4004 different was its flexibility. Instead of being hardwired for a certain task, the chip worked with software to perform its duties, reducing cost and adding never-before-seen flexibility to the design of computing devices.

A Stanford Ph.D. and former research associate, Hoff--Intel employee No. 12--was on the ground floor as part of a triumvirate of Intel employees who helped steward the technology's conception, design and birth as a product: the first general-purpose microprocessor.

But as Hoff recalls, "It wasn't quite as organized as that. The initial role I had was to provide applications information and possibly suggest (usage) for Intel's product line."

His role turned into a much bigger one when he was asked for feedback by Intel founder and then-President Robert Noyce. Hoff would suggest the concept for the instruction set--the basic set of commands the chip can execute.

Q: How does it feel to be a computing legend?
A: It's very satisfying to see the acceptance (of the microprocessor). The people in the media seem very much aware of desktop and mobile as applications for microprocessors. But when we launched (the 4004), our view was embedded control. In fact, when we announced the product, we could not sell it in a calculator. So it was used for elevator controllers and gas pumps. The personal computer market wasn't ready because of the lack of other peripherals you'd need.

The typical hard disk of the day was a platter that was maybe 15 or 16 inches in diameter, would hold 1MB or 2MB, and cost $10,000 or $15,000. It didn't make sense to save a few hundred dollars on the microchip if you had to spend $10,000 for a printer.

Embedded control sort of launched the market and provided for the platform to be advanced. It spurred the industry to develop lower-cost printers and lower-cost hard drives.

Can you walk us through the 4004 project and how a chip was developed at the time?
It really didn't start off as a project. It started off where I was doing liaison work with the engineering team. I had seen the inside of computers. But the idea of a desktop calculator that was going to be a relatively inexpensive device I had not seen, and I was really curious. The project came about when it was felt it might be reasonable for Intel to do some custom work, and the first such project they took on was this calculator for Busicom. I was assigned just to be a liaison for them, not to provide any design work.

What was your input?
(Later) I became concerned about the design. Bob Noyce asked for suggestions. I came up with the idea that...a general-purpose computer could be programmed to perform specific tasks needed for the calculator. For example, the original design called for logic to run the printer. But I said, "Let's come up with programming to run the printer." Everything you move (from logic) to programming means you put it into ROM (read-only memory), which you got pretty cheaply. Logic required more area and more design time, and it committed you to that particular printer. Whereas if you wanted to change the printer later...you could just write a new program.
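A minimal sketch of the trade-off Hoff describes, in modern code rather than anything from the Busicom design: when printer control lives in a replaceable program instead of dedicated logic, supporting a new printer means loading a different routine. The driver names and command tokens below are hypothetical.

```python
# Illustrative sketch (not the Busicom design): printer control as a
# replaceable program instead of hardwired logic. The drivers and
# command names are invented for illustration.

def drum_printer_driver(line):
    # Routine for the original printer: advance the drum, fire hammers.
    return [("ADVANCE_DRUM", None)] + [("FIRE_HAMMER", ch) for ch in line]

def new_printer_driver(line):
    # Routine for a later printer with a different mechanism.
    return [("STROBE", ch) for ch in line] + [("LINE_FEED", None)]

# The "ROM" is just whichever routine is currently burned in; changing
# printers changes this one reference, not the rest of the system.
rom_print_routine = drum_printer_driver
print(rom_print_routine("123"))
```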

Meanwhile, the instruction set I had (proposed) was pretty much from scratch. I had the idea of, "Let's try to get a really primitive instruction set, but then we can concentrate on optimizing its performance." I had the idea that I wanted to start with 4-bit instructions. Both Stan Mazor and (Masatoshi) Shima (a Busicom engineer at the time) made some suggestions. But the basic outline was mine. That led to the architecture for what became the 4004.
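To make the "really primitive instruction set" idea concrete, here is a toy 4-bit accumulator machine; the opcodes are invented for illustration and are not the 4004's actual instruction set.

```python
# Toy 4-bit accumulator machine, invented for illustration; these are
# not the 4004's real opcodes. All values are masked to 4 bits.

PROGRAM = [
    ("LDI", 3),  # load the immediate value 3 into the accumulator
    ("ADD", 9),  # accumulator becomes 12
    ("ADD", 7),  # 12 + 7 = 19, which wraps to 3 in 4 bits
    ("OUT", 0),  # print the accumulator
]

def run(program):
    acc = 0
    for op, operand in program:
        if op == "LDI":
            acc = operand & 0xF          # registers hold only 4 bits
        elif op == "ADD":
            acc = (acc + operand) & 0xF  # arithmetic wraps modulo 16
        elif op == "OUT":
            print(acc)
        else:
            raise ValueError(f"unknown opcode: {op}")

run(PROGRAM)  # prints 3
```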

What phase of the development was that?
Mazor helped me complete the architectural phase of it. The design was then transferred over to the MOS design group. That was headed by Les Vadasz (now head of Intel Capital). Later, Federico Faggin joined. He got three chips out and done in less than a year from when he joined. But I got it launched, you might say. Originally the design was quite different. It was to be a chipset for a family of calculators.

What happened next?
We had a semi-formal proposal that our marketing people actually sent off to (Busicom) as an alternative approach. They chose our design. They liked the flexibility of it. That's what got it launched. Once it was approved...Mazor and I did the support activity. We laid out circuit boards, so you could build small computers (for example).

How does the design process of the 4004 differ from creating a chip today?
So much was done by hand, whether it be the logic design or the computer instruction set...trying to test the thing. We didn't have the simulation capability that we have now. It was very primitive and it was very slow to develop a program because it took so long to compile and debug the program.

Of course, the tools for simulation and checking have been tremendously improved, so that you can not only do the design, but check it out to get as many bugs out as possible (before "tape out," when the chip hardware is first produced for testing). It's a classic science-fiction scenario...where computers design the next computers and make them better.

Did you have any idea what kind of revolution the 4004 would start?
Probably not from the point of view of the PC. But the way we looked at it was, as engineers, this thing really helps me solve a number of design problems for this gadget I'm building. I was involved in building an EPROM (erasable programmable read-only memory) burner--for programming data into an EPROM (a previous burner used hardwired logic)--but it's so much easier to do the (burner) design with a microprocessor. Not only that, but then it's more flexible if you need to upgrade when...a new EPROM comes out. So it was much easier to upgrade with a microprocessor than it was with hardwired logic.
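A rough sketch of why the microprocessor-based burner was easier to upgrade; the hardware access below is simulated and the per-device timing values are hypothetical. The point is that the EPROM-specific details reduce to data plus a program loop, so supporting a new part means new parameters rather than rewired logic.

```python
# Sketch of a software-driven EPROM burner's inner loop; hardware access
# is simulated and the timing values are hypothetical. Supporting a new
# EPROM becomes a matter of changing parameters, not rewiring logic.
import time

EPROM_PARAMS = {
    "1702A": {"pulse_ms": 3.0},
    "2708":  {"pulse_ms": 1.0},
}

def burn(device, image):
    pulse_ms = EPROM_PARAMS[device]["pulse_ms"]
    rom = [0x00] * len(image)        # stand-in for the physical chip
    for addr, byte in enumerate(image):
        rom[addr] = byte             # apply address and data, pulse Vpp
        time.sleep(pulse_ms / 1000)  # hold the programming pulse
        assert rom[addr] == byte, f"verify failed at address {addr:#x}"
    return rom

burn("2708", [0xA5, 0x5A, 0xFF])
```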

We felt that if we felt that way, then there were a lot of other engineers out there who would accept it.

What was your primary concern?
Our biggest fear was the minicomputer. The worry was that people might use that approach or that people that had worked with the minicomputer would find the microprocessor was an order of magnitude slower. But there were a lot of applications where speed wasn't a factor and where cost was a factor (where the microprocessor had a large cost advantage). As the underlying technology improved, we were able to make faster and faster processors.

Other than the 4004, what was the most important chip in PC history--the 8088?
Probably the biggest step was when the 8080 came out. That became the first of the microprocessors that really was right up there, comparable with the minicomputer speed. (At that point the microprocessor became a better choice, because it was cheaper.)

What other chips did you work on?
I provided architectural input on the 8080. But by that time (Intel) had a whole team of designers, and Shima had come over from Japan and joined Intel. He did most of the chip design for that.

I did some work on a bipolar processor family. Then Bob Noyce came around and asked me to look at an entirely different area, the telephone industry, to see what we could do there...We developed the first monolithic telephone codec.

I left in the beginning of 1983 to go to Atari.

What do you think when people say today's PCs are fast enough?
People often say that. Why do you need more processing power? They argue you can only type so fast, but I think that sometimes they forget that there are other applications besides word processing for which computers can be used. Even trying to do entertainment (such as streaming media) in real time takes a lot of computing power.

What do you think the next big innovation will be?
I can see a number of areas where things could be developed. One of the things I think computing power would be of use for is language translation. I believe some work is available there already, but (work needs to be done) to improve it.

If we take (language translation) out of the realm of written language and move it into speech, like on the telephone--that's an area I'd like to see developed. But it's more than just the computing power. It'd take better speech recognition. That's been limited by computing power, but that should get better.

The realm of entertainment, music and video synthesis and compression could benefit from increased computing power. You can expect to see the (compression) algorithms improve as computing power increases.

How does Intel stand these days, in your opinion?
I'd like to think the future is rosy. Intel has fantastic technology and capability for design. I think the main thing, though, is that they've had really good management. Way back when we were first coming out with the microprocessor, I remember attending a meeting and it was brought up that a lot of microprocessors were being sold through distribution, and the feeling was we needed to improve our visibility (the ability to interact with distributors). (Management) wanted to know how much was in inventory...At the time I don't think I appreciated the importance of that.

It's the kind of management that's looking out for the problems and (saying), "Let's try to anticipate them before they bite us." It's that kind of attitude that Intel has done very well by.