
Report: IBM researcher says Moore's Law at end

IBM Fellow Carl Anderson said at a conference this week that Moore's Law is hitting a ceiling, according to a report.

Brooke Crothers, former CNET contributor

Moore's Law is maxing out. This is an oft-made prediction in the computer industry. The latest to chime in is an IBM fellow, according to a report.

Intel co-founder Gordon Moore predicted in 1965 that the number of transistors on a microprocessor would double approximately every two years--a prediction that has proved remarkably resilient. But Carl Anderson, an IBM Fellow who researches server computer design, claims the end of the Moore's Law era is nigh, according to a report in EE Times.
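The doubling Moore described can be sketched as simple arithmetic. The function and starting count below are illustrative only, not historical transistor data:

```python
# A minimal sketch of the doubling described above: transistor
# counts roughly double every two years. The starting count is
# a made-up example, not a real chip's transistor count.

def projected_transistors(start_count, years, doubling_period=2):
    """Project a transistor count forward under exponential doubling."""
    return start_count * 2 ** (years / doubling_period)

# Any starting count grows 32x over a decade (five doublings).
print(projected_transistors(1_000, 10))  # 32000.0
```

The exponential form is what makes the trend so hard to sustain: each doubling period multiplies the total, which is the point Anderson makes below about all exponential growth eventually ending.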

Exponential growth in every industry eventually has to come to an end, according to Anderson, who cited the growth of railroads and speed increases in the aircraft industry as precedents, the report said.

"A generation or two of continued exponential growth will likely continue only for leading-edge chips such as multicore microprocessors, but more designers are finding that everyday applications do not require the latest physical designs," Anderson said in the EE Times' report. Anderson also cited the staggering costs of research and fabs (factories) as a formidable barrier for continued advancement. Few companies can afford chip plants that typically cost billions of dollars to build and maintain.

So, what does the future hold? Anderson cited three technologies as seeing significant advancement: optical interconnects; 3D chips, which stack circuits and components on top of each other; and accelerator-based processing, the report said. The last of these, accelerators, is hot right now.

In addition to IBM, companies such as Nvidia and Advanced Micro Devices' ATI unit supply graphics-processor-based computers to accelerate scientific, engineering, and animation applications. Intel is also expected to bring out its Larrabee chip, which can be used as an accelerator, later this year or early next year.