In an era in which resume inflation seems the rule rather than the exception, Thacker's achievements still command special attention. During the course of his work at Xerox's Palo Alto Research Center, he helped pioneer two of the biggest advances: the PC and network connectivity. Thacker was the chief designer of the Alto, the first personal computer to use a bit-mapped display and mouse. Along with Bob Metcalfe, he was also co-inventor of the Ethernet local area network. More recently, Thacker had a major hand in the design of Microsoft's Tablet PC.
CNET News.com recently caught up with Thacker after he received the von Neumann award, bestowed by the Institute of Electrical and Electronics Engineers.
Q: You went to (University of California at) Berkeley, where you studied physics. Did you start off thinking that would lead to a career as a physicist?
Thacker: Actually, I had a very definite plan. I wanted to design particle accelerators. I wanted to be not a theoretical physicist but an experimental physicist and I had worked with that before. I worked at the Caltech Synchrotron Laboratory, as a matter of fact. So I was very definitely preparing myself for a career doing that.
Can you step me through how that led to your involvement with Project Genie?
Thacker: My wife and I were quite happily married, living in Berkeley, and I decided to take a year off between undergraduate and graduate school and I went to work for a friend who was in Berkeley at that time. He was a guy named Jack Holly, who used to run a company called Berkeley Instruments. I said, "OK, Jack, I'll tell you what I'll do. I'll work for you for a year and I will design electronics for you because you're not good at it and you can teach me to run all your machine tools because I'm not good at that. Pay me something, but you don't have to pay me a whole lot of money."
About nine months later, a friend of his dropped by one day who was a representative for an electronics component company. He mentioned that the Genie Project was looking for a staff engineer and I said, "What's that?" He said "Well, it's a pretty interesting computer project." So I went up and talked to them and we hit it off and I went to work for them--and as a result I never went to graduate school.
That's a great story. While you were there, you also had a chance to know J.C.R. Licklider.
Thacker: Yeah. I never really worked with Lick, who by then was retired, but he visited our lab many times and I got to know him.
When you left to go to Xerox PARC, you went with Butler Lampson and Peter Deutsch, who worked with you on the Genie Project, and you guys became the core of the research team over there?
You spent about 12, 13 years at PARC. Nowadays the story of what went on there is referred to in almost reverential terms. What was the more gritty reality on the ground? When you arrived how was it organized? Was it organized, was it a free-for-all or was it somewhere in between?
Thacker: Well, I was approximately employee No. 1. It was very interesting because there was basically nothing. It was Bob Taylor and me in an empty building. So we figured out what we were going to do. I was mainly interested in the infrastructure aspect, not the computer science part. At that time I thought of myself very much as an engineer.
So we realized that we needed a time-sharing machine and the one we wanted to get was a PDP-10 (programmed data processor, model 10) made by Digital Equipment. Unfortunately, Xerox had just paid a large amount of money for Scientific Data Systems, and wanted us to use the machine they were making. We looked at it as a vehicle for a time-sharing machine and decided it wasn't very good. So we built our own, one that was functionally identical (to the PDP-10), but was quite different in implementation. And that took about a year.
We had been interested in smaller machines. We had some machines which were actually personal machines, but they never went very far because they didn't have very good displays. They used an old display technology, where the CRT beam was steered around by magnetic or electric fields to actually draw characters and things like that, but they were pretty crummy characters. They're the sort of thing you still see today in air traffic control centers. What we wanted to do was build a machine that could be really a personal machine. I mean, you could call it a personal computer.
You're talking about the Alto now?
Thacker: That was the Alto. And once we had the Alto we realized that the machines would be much more powerful if they were put together with a network, and so we did the Ethernet.
Again, describe what you and the rest of the team were thinking. Did you grasp the potential of what you had created?
Thacker: Oh yeah. We did understand Moore's Law. We realized most of the implications and that was why we did something which was very daring for the time--which was the bitmap display. We realized that it used three-quarters of the memory of the machine to draw one frame, but that was going to get cheaper real fast. And so it made sense to actually build the software along that paradigm and go through the trouble of getting good fonts and so on.
Once we had that, we also had a technology that had been developed by a guy named Gary Starkweather, who couldn't find a home for it inside the East Coast part of Xerox, and so he came to PARC and did most of the optical and electromechanical engineering on the laser printer. We did the electronics. That was kind of a marriage made in heaven because that is the thing that Xerox made a bundle on.
So why didn't the folks at Xerox realize what they had on their hands? So many of the key inventors--you included--finally just said, "Enough, we're outta here."
Thacker: The problem was that Xerox was a copier company and at that time Xerox's main business was under attack because of the Japanese getting into the low-end copier business. I think it was a combination of lack of vision and the fact that we were kind of far away and so we didn't get much face time with the executives in Stamford (Conn.). As a result, they really didn't understand what we were doing and they were also not computer people. Most of what they saw they didn't really understand. The other reason, of course, was at the time that we did all this stuff, it was really too expensive.
A few years later when you went to join Digital Equipment, DEC was a very hot company, but within a few years, it also was under pressure because of the emergence of the PC. Again you had history repeating itself.
Thacker: That's right.
But this time it was Ken Olsen who was slow to realize things were changing.
Thacker: Yeah, Ken Olsen missed it too. To be fair about that, we had workstation machines at DEC and we made them and they were much better than the PC. But they were still too expensive. Ken's comment about why would anybody ever want a computer in their home--that was just misguided. A lot of us knew that computers in your homes were good because we had computers in our home and we used them for work. We played games too. They weren't as exotic as the games you can buy today, but they were there.
What's been most surprising to you about the way the technology industry has evolved during these past 35 years? Have you given much thought to the subject of social networks?
Thacker: Well, I've been a little bit surprised by some of that stuff. I mean some of it is very scary. But I'm really too old to have a valid opinion about that sort of stuff because those things are really for young people. The unfortunate thing is they tend to be for young people who don't go out and don't really have a social life. So they have a sort of second social life on Second Life, and that's kind of sad. But who am I to judge?
You also helped to design the tablet PC. The concept still has not won over the mass public. How do you see the trend likely playing out over the next five to 10 years?
Thacker: Well, it's actually interesting because when we actually built the tablet at DEC in 1993, we were fairly familiar with what the problems were going to be--and we knew they would be severe. The main one being battery life. The handwriting recognition had to work very well in order for it to be accepted without a keyboard, and I think we came very, very close. I think there's a lot more evolution that is possible there...So in some ways the tablet has been disappointing. But on the other hand, I'm heartened by the fact that it is still a growing market. It just grows every year a little bit, and it's the only tablet-like computing product that has ever lived in the market for as long as it has. It's now six years old.
I'd like to get your thoughts on the quality of computer scientists coming up through the ranks these days. There's been a lot of controversy about the quality of math and science education. Some people say that it's gotten to the point where the U.S. is in danger of losing its technology edge to other nations.
Thacker: I am actually quite disturbed by this trend. I worked on a project to try to figure out how to actually use computers more effectively in education in the lower grades because a lot of people now work on improving university-level education and maybe high school. That's not where the problem is. The problem is in the very first exposure of a kid to education. It's a hard slog because the education market is worse than the medical market in terms of fragmentation.
As a computer scientist, what do you see as the next big challenge, the next big hurdle for computer science?
Thacker: It is getting to be the case that you cannot build a more complex chip. You can't actually keep up with Moore's Law the way we have been doing. For 35 years the Intels and the AMDs of the world have been turning Moore's Law into more performance--that is, higher speed. Unfortunately, one of the things about that is, it also increases the power and you're now pretty much up against the limit of the amount of heat you can remove from one of these chips.
So we need to look around and see what to do...The tradeoff is that it looks like the manufacturers are going to want to increase the number of processors on one silicon chip rather than increasing the complexity of a single processor. The problem with that is that we don't know how to program it. We just aren't very good at concurrency. One of the things that I say to my academic friends is that--to some extent--this is your fault because we hire a lot of computer science graduates with bachelor's degrees. They have not learned anything about parallel programming.
So, we need a combination of better education and better models for programming...If we could develop better abstractions that would be fine, but we're in the dark right now. I think that's the big challenge going forward.