Myths of Moore's Law

Most computer cognoscenti think they know what this law states, but CNET News.com's Michael Kanellos says the 11 words in the dictum make for one of the most misunderstood statements in all of technology.

Michael Kanellos Staff Writer, CNET News.com
Michael Kanellos is editor at large at CNET News.com, where he covers hardware, research and development, start-ups and the tech industry overseas.
Moore's Law is only 11 words long, but it's one of the most misunderstood statements in technology.

The basic rule--which states that the number of transistors on a chip doubles every 24 months--has been the guiding principle of the high-tech industry since it was coined by Intel co-founder Gordon Moore in 1965.
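The compounding the rule implies can be sketched in a few lines of Python. This is an illustration, not anything from the column itself; the starting point of 2,300 transistors for Intel's 4004 in 1971 is historical, but the projection simply assumes the 24-month doubling holds without interruption.

```python
def transistors(start_count, start_year, year, months_per_doubling=24):
    """Project a transistor count forward under steady doubling."""
    doublings = (year - start_year) * 12 / months_per_doubling
    return start_count * 2 ** doublings

# Illustrative: the Intel 4004 shipped in 1971 with 2,300 transistors.
# Sixteen doublings later, in 2003, the rule predicts about 150 million.
print(round(transistors(2300, 1971, 2003)))
```

Real chip counts in 2003 ranged from tens to hundreds of millions of transistors, so the projection lands in the right order of magnitude, which is the rule's real claim.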

It predicts technological progress and explains why the computer industry has consistently been able to come out with products that are smaller, more powerful and less expensive than their predecessors--a dynamic curve that other industries can't match.

Still, most people manage to mangle the rule, one way or another. Many people, for instance, state that Moore's Law says the number of transistors doubles every 18 months--a time frame never laid down by Moore.

Others claim that Moore came up with it while driving down Highway 101 in Silicon Valley. (He says he came up with it while preparing an article for Electronics magazine.)

Worst of all, many postulate that Moore's Law is in danger of running aground because the world no longer needs more powerful computers.

For example, The Economist theorized on May 8 that the rule was becoming irrelevant, partly because Google CEO Eric Schmidt said the search company relies on less-than-cutting-edge servers. "The industry is simply too efficient," he said. And Kim Polese, founder of corporate software company Marimba, was one of the software executives who told The New York Times that the rule's force was petering out, because people wanted to spend less time at work and more time with their families.

In a bit of magazine performance art, Red Herring ran a cover story on the death of Moore's Law in February--and subsequently went out of business.

These theories, though, ignore a key driving factor behind the famous rule, which is this: People aren't following it out of the goodness of their hearts.

Moore's Law, after all, is not a law of physics. It is merely an uncannily accurate observation on what electrical engineers, when organized properly, can do with silicon. Companies that can keep their tech teams humming will reap profits and power. Those that can't will fade away.

One way to view the rule in action is through the history of the 1GHz chip. Both Advanced Micro Devices and Intel released 1GHz microprocessors during the first week of March 2000. At the time, analysts claimed the chips offered more performance than people needed. In fact, the chips probably still offer more than most consumers need.

So why didn't Intel just quit spending billions on new factories and advancing its processor line? Because AMD wouldn't.

When it debuted, the 1GHz Pentium III cost $990 in volume quantities, and Intel had around 80 percent of the market. Flash forward three years. AMD's most-powerful chip provides around 3.2GHz of performance and costs $464, while its cheapest processor gives 2GHz of performance and costs $66. Meanwhile, Intel's least-expensive desktop chip, a Celeron, runs at 2.1GHz.

Had Intel remained pat, it would be the one losing millions of dollars per quarter while AMD would be the one controlling more than 80 percent of the market.

PC makers that stuck with Intel would have gone to the glue factory as well. A Dell Dimension released in 2000 with the 1GHz Pentium III, 256MB of memory, a 30GB hard drive, a CD-RW drive and a DVD player cost $5,999. That's six times more expensive and about one-third as powerful as a midrange box Dell Computer released this week.

Google, meanwhile, would have had to triple the real estate it leases for its server rooms, or run at a crawl, if it really had jumped off the Moore's Law curve. Hard-drive manufacturers, software developers and even content sites face the same inexorable dilemma of "improve or die." When you think about it, Moore's Law isn't about progress; it lays down the rules of an arms race.

Granted, it's a mathematically mind-boggling concept. If Moore's Law were applied to urban growth, a 250,000-person city such as Reno, Nev., would be as large as New York City within a decade. In another decade, the city's population would hit 256 million--almost double that of Russia.

In terms of size--under Moore's Law, chips get smaller over time--Reno would cover less than one-third of the square mileage it has today. But it wouldn't be flat anymore: There would be six levels of city stacked on top of one another. The fireplace in the Fireside Lounge at the Peppermill would probably emit almost as much energy as a hydrogen bomb.
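The Reno arithmetic above checks out: doubling every two years means five doublings per decade, a 32-fold increase. A quick sketch (the function is illustrative, not from the column):

```python
def city_under_moores_law(start_pop, decades, months_per_doubling=24):
    """Compound a population the way Moore's Law compounds transistors."""
    doublings = decades * 10 * 12 / months_per_doubling  # five per decade
    return int(start_pop * 2 ** doublings)

print(f"{city_under_moores_law(250_000, 1):,}")  # 8,000,000 -- roughly New York City
print(f"{city_under_moores_law(250_000, 2):,}")  # 256,000,000 -- nearly twice Russia
```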


Companies have found the pace torrid and expensive. Initially, Moore predicted transistors would double in number every year; in 1975, he slowed it down to every two years. Shrinking the size of transistors has also led to outrageous capital equipment budgets. Rock's Law holds that the cost of a semiconductor fab, or fabrication facility, will double every four years. Now, an average fab costs $3 billion, and most companies can't afford to stay in the market as independent operators.
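Rock's Law compounds the same way Moore's does, just with dollars instead of transistors. A minimal sketch, assuming the $3 billion figure above as the starting point:

```python
def fab_cost(cost_today, years_out, doubling_years=4):
    """Rock's Law: the cost of building a fab doubles every four years."""
    return cost_today * 2 ** (years_out / doubling_years)

# Two doublings out from a $3 billion fab: $12 billion in eight years.
print(fab_cost(3e9, 8) / 1e9)  # billions of dollars
```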

The laws of physics will likely begin to slow down the pace of Moore's Law over the next decade. Ultimately, the rule may have to be looked on as a generalized guidepost for performance improvement. Progress, meanwhile, will come more from better system design than from increases in transistors.

This would be more in line with the observation, back in the '80s, of former Intel executive David House, who said that performance doubles every 18 months. (Technically, performance doubles every 20 months, but House was close.)

Still, transistors will continue to shrink and computational power will continue to increase, regardless of predictions of stasis. As VLSI Research CEO Dan Hutcheson points out, "It's good enough" is what the clay-tablet makers said about their products when papyrus came out.