Moore's Law is the reason your iPhone is so thin and cheap
Intel co-founder Gordon Moore's observation 50 years ago set the groundwork for self-driving cars on the road and computers in our pockets today.
Roger Cheng, Former Executive Editor / Head of News
To get a sense of what society owes to Moore's Law, just ask what the world would look like if Intel co-founder Gordon Moore never made his famous 1965 observation that the processing power of computers would increase exponentially.
"It is almost unimaginable," said Genevieve Bell, a cultural anthropologist for Intel.
"The implications would be so dramatic, I struggle to put it in words," said Adrian Valenzuela, marketing director for processors for Texas Instruments.
Jeff Bokor, a professor of electrical engineering and computer science at the University of California, Berkeley, found at least one: "Cataclysmic."
The comments aren't wild hyperbole; they underscore just how significant an impact one little observation has had on the world. Moore's Law is more than a guideline for computer processor, or chip, manufacturing. It's become shorthand for innovation at regular intervals, and a self-fulfilling prophecy driving the tech industry.
Are you happy about your sleeker iPhone 6 or cheaper Chromebook? You can thank Moore's Law.
But first, let's explore the effect of Moore's Law throughout history -- and start by dispelling some misconceptions. Most importantly, Moore's Law is not actually a law like Isaac Newton's Three Laws of Motion. In a paper titled "Cramming More Components onto Integrated Circuits," published by the trade journal Electronics in 1965, Moore, who studied chemistry and physics, predicted that the number of components in an integrated circuit -- the brains of a computer -- would double every year, boosting performance.
A decade later, he slowed his prediction to a doubling of components every two years.
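That revised prediction amounts to a simple exponential: the component count multiplies by two for every two years elapsed. A back-of-envelope sketch (using the roughly 2,300 transistors of Intel's 1971 4004, the company's first microprocessor, as a starting point -- a detail not in this article) shows how quickly the doubling compounds:

```python
def moores_law(start_year: int, start_count: int, year: int) -> int:
    """Project a transistor count, assuming a doubling every two years."""
    doublings = (year - start_year) / 2
    return round(start_count * 2 ** doublings)

# Project forward from the Intel 4004 (1971, ~2,300 transistors).
for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{moores_law(1971, 2300, year):,}")
```

Over two decades the projection climbs from thousands of transistors into the millions, and over four decades into the billions -- the same order of magnitude as real chips of those eras, which is why the observation held up as an industry roadmap for so long.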
It wasn't until Carver Mead, a professor at the California Institute of Technology who worked with Moore at the Institute of Electrical and Electronics Engineers, coined the term "Moore's Law" in 1975 that it gained widespread recognition in the tech world. It became a goal for an entire industry to aspire to -- and hit -- for five decades.
"[It's] a name that has stuck beyond anything that I think could have been anticipated," Moore, now 86, said in an interview with Intel earlier this year.
A self-fulfilling prophecy
Moore's Law specifically refers to transistors, which switch electrical signals on and off so that devices can process information and perform tasks. They serve as the building blocks for the brains inside all our smartphones, tablets and digital gadgets.
The more transistors on a chip, the faster that chip processes information.
To keep Moore's Law going, chip manufacturers have to keep shrinking the size of the transistors so more can be packed together with each subsequent generation of the technology. The first transistor was about half an inch long. Today's newest chips contain transistors that are smaller than a virus, an almost unimaginably small scale. Chipmakers including Intel and Samsung are pushing to shrink them even more.
But size doesn't really matter when it comes to appreciating Moore's Law. More important is the broader idea that things get better -- smarter -- over time.
The law has resulted in dramatic increases in performance in smaller packages. The Texas Instruments processor that powers the navigation system in a modern Ford vehicle is nearly 1.8 million times more powerful than the Launch Vehicle Digital Computer that helped astronauts navigate their way to the moon in 1969.
And Apple's iPhone 6 is roughly 1 million times more powerful than an IBM computer from 1975 -- which took up an entire room -- according to a rough estimate by UC Berkeley's Bokor. The iPhone, priced starting at $650, is also a lot cheaper than a full-fledged desktop computer selling anywhere between $1,000 and $4,000 a decade ago -- and it can do so much more.
Just as critical is the time element of Moore's Law: the doubling of transistors every two years meant the entire tech industry -- from consumer electronics manufacturers to companies that make the equipment to manufacture chips and everything in between -- had a consistent cadence everyone could work to.
"It created a metronome," Bell said. "It's given us this incredible notion of constant progress that is constantly changing."
It also set a pace that companies need to keep, or else get left behind, according to Moore. "Rather than become something that chronicled the progress of the industry, Moore's Law became something that drove it," Moore said in an online interview with semiconductor industry supplier ASML in December.
While he didn't think his observation would hold true forever, chipmakers don't seem to be slowing down their efforts. "It's a self-fulfilling prophecy, so to the industry it seems like a law," said Tsu-Jae King Liu, a professor of microelectronics at UC Berkeley.
Life without Moore's Law
Nowadays, everyone assumes technology will just get better, faster and cheaper. If we don't have a sophisticated enough processor to power a self-driving car now, a faster one will emerge in a year or two.
Remove Moore's Law, and that assumption no longer holds true. Without a unifying observation to propel the industry forward, the state of integrated circuits and components might be decades behind.
"It's an exponential curve, and we would be much earlier on that curve," Valenzuela said. "I'm happy to say I don't have to carry my 1980s Zack Morris phone."
Intel's Bell imagines a more "horrifying" world without integrated circuits, one in which everything is mechanized, and common tropes of technology such as smartphones and even modern telephone service wouldn't exist. "The Internet would have been impossible," she said.
It's not a completely implausible alternate reality. Bell noted that many industries haven't moved as quickly to embrace new technology and ideas. The internal combustion engine hasn't changed much since Henry Ford's Model T more than a century ago, and it's only in the last several years that automakers have embraced battery-powered electric motors as an alternative.
Speaking of batteries, there's a reason why our smartphones lose their juice faster and faster -- battery technology hasn't kept pace with the advancement of the processor and its capabilities.
"Not too many industries have a clearly defined expectation in improvement of capability and cost benefits over such a long time," said H.S. Philip Wong, an engineering professor at Stanford.
The future's so bright
It's a lot easier to document the progress achieved through Moore's Law. Increasingly sophisticated chips have resulted in not just more powerful standalone devices, but an ecosystem of gadgets that can talk to each other.
As Bell said, there would be no Internet without Moore's Law, which means Google or Facebook would never have existed, and Netflix would still be mailing DVDs (VHS tapes?) to you.
"It's a technology that's been much more open-ended than I would have thought in 1965 or 1975," Moore said. "And it's not obvious yet when it will come to the end."
Smaller processors have driven interest in the Internet of Things (IoT), or the idea that physical objects around us can be connected to the Internet and to each other. TI's Valenzuela said he remembers selling basic thermostats using rudimentary chips. Now smart thermostats built by Google's Nest have a processor powerful enough to run a smartphone.
Intel demonstrated the potential for the IoT idea in January at the Consumer Electronics Show with Curie, a button-size module designed to power smart wearable devices with a low-power processor. It's the reason why we're talking about self-driving cars, smart transportation systems, smart homes, smart watches and even clothes equipped with Internet-connected sensors.
"It's really like the water that we drink and air that we breathe," Wong said about society's dependence on the innovations brought on by Moore's Law. "We can't survive without it."