It's Moore's Law, but another had the idea first

On its 40th anniversary, there is some friendly dispute about a similar observation by the inventor of the mouse.

One of the cornerstones of Silicon Valley will mark an anniversary Tuesday.

Forty years ago, Electronics magazine published Gordon E. Moore's celebrated article predicting that the number of transistors that could be placed on a silicon chip would continue to double at regular intervals for the foreseeable future.

Named Moore's Law several years later by the physicist Carver Mead, that simple observation has proved to be the bedrock of the world's most remarkable industry.

Yet Moore was not the only one--or even the first--to observe the so-called scaling effect that has led to the exponential acceleration of computing power that is now expected to continue at least for the next decade.

Before Moore's magazine article precisely plotted the increase in the number of transistors on a chip, beginning with a single transistor, the computer scientist Douglas C. Engelbart had made a similar observation at the very dawn of the integrated-circuit era. Moore had heard Engelbart lecture on the subject, possibly in 1960.

Engelbart would later be hailed as the inventor of the computer mouse as well as the leading developer of many technologies that underlie both the personal computer industry and the Internet.

In a 2001 interview, Engelbart said that it was his thinking about the scaling down of circuits that gave him the confidence to move ahead with the design of an interactive computing system.

"I was relieved because it wasn't as crazy as everyone thought," he said.

Significantly, the two pioneers represent twin Silicon Valley cultures that have combined to create the digital economy.

Moore, who co-founded Intel, is an icon of the precise and perhaps narrower chip engineering discipline that today continues to progress by layering sheets of individual molecules, one on top of the other, and by making wires that are finer in diameter than a wavelength of light.

"Gordon was the classic engineer," said Craig Barrett, Intel's chief executive, who had just begun to teach engineering at Stanford University when Moore made his famous prediction.

The chart that accompanied his article was a plot that showed just five data points over seven years and extrapolated out into the future as far as 1975, when a single chip would be able to hold as many as 65,000 transistors. Forty years later, memory chip capacity has gone far beyond 1 billion of the tiny switches.
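
The arithmetic behind those numbers is simple compounding. As a minimal sketch (not from Moore's article; the one-transistor starting point in 1959 and the one-year doubling period are assumptions drawn from the figures above), the projection amounts to a one-line function in Python:

    def transistors(year, start_year=1959, start_count=1, doubling_years=1.0):
        # Count after (year - start_year) / doubling_years doublings.
        return start_count * 2 ** ((year - start_year) / doubling_years)

    print(transistors(1975))  # 2**16 = 65536, about the 65,000 Moore projected
    print(transistors(1989))  # 2**30, just over 1 billion after thirty doublings

In practice the doubling interval stretched--Moore himself revised it to roughly two years in 1975--which is why the billion-transistor mark arrived decades later than an annual doubling would imply.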

Augmenting the human mind
Engelbart, in contrast, was the architect of a passionately held view that computing could extend or "augment" the power of the human mind. His ideas were set out most clearly in 1968, in a famous demonstration in San Francisco of his Pentagon-financed Augment computing system. Many things were shown to the world for the first time, including the mouse, videoconferencing, interactive text editing, hypertext and networking--basically the outlines of modern Internet-style computing.

Engelbart had an epiphany in 1950, in which he imagined what would decades later become today's Internet-connected PC. He set about building it. At the time he had no idea of how he would build such a machine, but it soon became clear that it would require a computer that did not yet exist.

Later he was offered a job at Hewlett-Packard, but when he learned that the company had no plans to enter the computer business, he went to work instead at Stanford Research Institute, now SRI International.

There he worked with a group of military-funded researchers who were attempting to build magnetic-based computing circuits. The military was interested in the technology because of its potential performance in outer space.

With the invention of the integrated circuit in 1959, however, the group realized that its work would soon fall by the wayside.

Thinking about the idea of miniaturized circuitry, Engelbart realized that it would scale down to vastly smaller sizes than the electronic components of the day. He had that insight because earlier he had worked as an electronics technician in the wind tunnel at the Ames Research Center, a NASA laboratory in Mountain View, Calif. There, aerodynamicists made models and scaled them up into complete airplanes.

It was an easy conceptual leap to realize that integrated circuits would scale in the opposite direction. In 1959 he put his ideas into a paper, titled "Microelectronics and the Art of Similitude." In February 1960, he traveled to the International Solid-State Circuits Conference in Philadelphia. There he explained to his audience that as chips scaled down, microelectronics engineers would have to worry about changing constraints, just as aerodynamicists had to worry about them when scaling up in the macro world.

Will the chandelier fall?
One person who has a clear memory of Engelbart's description is Moore, although he does not remember whether he heard him speak in Philadelphia or elsewhere.

"The thing that I remember from it is his question if we would notice anything different if everything in the room was suddenly 10 times as large," he wrote in an e-mail message. "He answered it by suggesting that the chandelier might fall."

Several historians pointed out that Engelbart's earlier observation did nothing to detract from the significance of Moore's careful plotting of the trend.

"It still should be called Moore's Law rather than Engelbart's Law," said Michael Riordan, a historian of physics at the University of California, Santa Cruz. "Science is still based on theory and experiment."

As for Engelbart, the 1959 paper convinced him that the Augmentation machine he envisioned would be possible, because computing power would be plentiful in the future.

He was one of the first to grasp the implications of the new technology. Years later he recalled in an interview that he told his Philadelphia audience, "Boy, are there going to be surprises over there."