Here's another warning that the end is within view for Moore's Law, one of the backbone theories of the computing industry.
Theoretical physicist Michio Kaku predicts that in "10 years or so we will see the collapse of Moore's Law. In fact, already, we are seeing a slowing down of Moore's Law. Computing power simply cannot maintain its rapid exponential rise using standard silicon technology."
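The "rapid exponential rise" Kaku refers to is the familiar doubling rule: transistor counts double roughly every two years. A minimal sketch of that arithmetic (the starting count and doubling period here are illustrative assumptions, not figures from Kaku or Intel):

```python
def transistors(start_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward under periodic doubling."""
    return start_count * 2 ** (years / doubling_period)

# With a two-year doubling period, ten years means five doublings,
# i.e. a 32-fold increase over the starting count.
print(transistors(1.0, 10))  # 32.0
```

Run the same projection in reverse and the doubling rule also implies feature sizes halving on a similar cadence, which is what drives the chip toward the atomic limits Kaku describes below.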
In a presentation earlier this month, Kaku noted that a Pentium chip today "has the layer almost down to 20 atoms across...when that layer gets down to 5 atoms across, it's all over. You have two effects. The heat generated will be so intense that the chip will melt."
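To put "atoms across" in more familiar units, a quick back-of-the-envelope conversion, assuming roughly 0.235 nm per atom (the approximate Si-Si bond length, used here purely for illustration):

```python
# Approximate Si-Si bond length in nanometers (an assumed figure
# for rough illustration, not a process-node specification).
SI_ATOM_SPACING_NM = 0.235

def feature_width_nm(atoms: int) -> float:
    """Rough width of a feature that is `atoms` silicon atoms across."""
    return atoms * SI_ATOM_SPACING_NM

print(feature_width_nm(20))  # 4.7 -- about 5 nm
print(feature_width_nm(5))   # 1.175 -- about 1 nm
```

So the 20-atom layer Kaku describes is on the order of 5 nm, and the 5-atom limit he warns about is roughly a single nanometer.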
The other problem he sees: leakage.
"You don't know where the electron is anymore. The quantum theory takes over. The Heisenberg uncertainty principle says you don't know where that electron is anymore meaning it could be outside the wire, outside the Pentium chip, or inside the Pentium chip. So there is an ultimate limit set by the laws of thermodynamics and set by the laws of quantum mechanics as to how much computing power you can do with silicon."
In the near term, Kaku expects the industry to keep tweaking current and future processor generations with known techniques, squeezing what it can out of the current state of the art. Ultimately, though, he expects the industry will have no choice but to embrace new approaches, such as molecular or quantum computers.
For the record, though, Intel says it's not worried. "It's not that it's impossible. It's that it's increasingly more challenging to do it," an Intel spokesman said. "Gordon Moore himself has said that eventually Moore's Law will run out, 'but every time I turn around, I'm fascinated how we've been able to extend it.'"
Hat tip to Carl Burns of SlashGear.