
A 'post-x86 world'? Preposterous!

GigaOM makes some bizarre assertions. Intel and the x86 architecture are not seriously threatened by ARM processors and new kinds of computing devices.

Peter Glaskowsky
Peter N. Glaskowsky is a computer architect in Silicon Valley and a technology analyst for the Envisioneering Group. He has designed chip- and board-level products in the defense and computer industries, managed design teams, and served as editor in chief of the industry newsletter "Microprocessor Report." He is a member of the CNET Blog Network and is not an employee of CNET.

I honestly don't know whether Om Malik's blog site, GigaOM, is intended to be informative or merely entertaining. I pointed out a previous example of the overwrought rhetoric that permeates that site last September (in the context of Comcast's then-new usage cap policy), but generally, I try to ignore the nonsense there for the same reasons that I ignore talk radio.

But like it or not, GigaOM is widely read, and sometimes when a post there bears directly on a market that's important to me, I can't bear to let it go. This is one of those times.

On Thursday, a GigaOM staffer wrote a piece titled "Can Intel Thrive in a Post x86 World?"

A slide from Fred Weber's keynote presentation at Microprocessor Forum 2003, showing how x86 will evolve into systems from big servers down to handheld consumer devices. (Credit: Advanced Micro Devices, Inc.)

The headline is preposterous from beginning to end. Its eight words pack in two implications: that Intel's ability to "thrive" faces some imminent threat, and that the importance of the x86 architecture is declining.

In January, the same staffer wrote a piece titled "Netbooks and the Death of x86 Computing," which reached the fantastic conclusion that Netbooks would "destroy the hegemony of x86 machines for personal computing."

Well, as I pointed out just a few weeks later (in "The Netbook is dead. Long live the notebook!"), when the Netbook phenomenon ran up against the dominance of Intel and Microsoft in the PC market, it was the Netbook that died instead. Even at a $300 price point, people still want full PC compatibility.

Yes, there are companies like Freescale (the subject of the January post on GigaOM) and Nvidia that are looking to push the ARM architecture into the Netbook space. But that idea never made much sense. And now that Intel and TSMC are working together to get Intel's Atom x86 core into lower-cost SoC (system on chip) products, the ARM architecture will eventually have to retreat into the shrinking niche of supersmall, supercheap phones and consumer-electronics gizmos for which x86 compatibility is of negligible value.

See, we learned a long time ago--those of us who cover this industry professionally, not just as a random assignment for some random blog--that the instruction set architecture (ISA), per se, doesn't matter any more.

The choice of ISA was a big deal in the 1980s and early 1990s, when the extra complexity of an x86 instruction decoder was a large fraction of the total complexity of a microprocessor. That's where the conflict between RISC and CISC came from.
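To make that concrete, here's a minimal illustration of my own (not from the GigaOM post or from Weber's talk): the same C function compiles for either instruction set, and the assembly shown in the comments is typical optimized gcc output, quoted from memory. The only piece of hardware that cares about the difference is the decoder.

```c
/* The same source code targets either ISA; the compiler absorbs the
 * difference, and the hardware pays for it only in the decode stage. */
int add(int a, int b)
{
    return a + b;
}

/* Typical gcc -O2 output (illustrative):
 *
 *   x86-64:   lea  eax, [rdi + rsi]    ; variable-length CISC encoding
 *             ret
 *
 *   ARM:      add  r0, r0, r1          ; fixed-length RISC encoding
 *             bx   lr
 *
 * The work is identical; only the instruction encoding differs, and
 * decoding that encoding is the part whose silicon cost keeps shrinking.
 */
```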

But by the turn of the century, ISA complexity was almost a dead issue, and that coffin's final nail was pounded in by the keynote speech of then-Advanced Micro Devices CTO Fred Weber at Microprocessor Forum 2003, an event I had the honor of hosting.

In his talk, "Towards Instruction Set Consolidation," Weber made a simple point: "Technology has passed the point where instruction set costs are at all relevant."

Even then, three generations of process technology ago, the "x86 penalty" was down to a couple of square millimeters of silicon. Today, the comparable figure is about 0.25 square millimeters. Not zero, certainly, but not a significant concern for chips that are a hundred times larger.
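If you want to check that arithmetic, the back-of-the-envelope sketch below spells it out. The halving-per-generation rule and the round starting figure of 2 square millimeters are my own assumptions for illustration, not numbers taken from Weber's slides:

```c
#include <stdio.h>

int main(void)
{
    /* Assumption: each full process generation shrinks a fixed block of
     * logic to roughly half its previous area (a ~0.7x linear shrink). */
    double penalty_mm2 = 2.0;        /* "a couple of square millimeters" circa 2003 */

    for (int generation = 0; generation < 3; generation++)
        penalty_mm2 *= 0.5;          /* three generations later */

    /* A chip "a hundred times larger" than the penalty itself. */
    double die_mm2 = penalty_mm2 * 100.0;

    printf("x86 penalty today: ~%.2f mm^2\n", penalty_mm2);                /* ~0.25 mm^2 */
    printf("Share of that die: ~%.1f%%\n", 100.0 * penalty_mm2 / die_mm2); /* ~1.0%      */
    return 0;
}
```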

In short, ARM chips aren't cheaper or more power-efficient because of their instruction sets; they're like that because they're designed to be. And anything that an ARM chip can do to save cost or power can also be done by an x86 chip.

So there can't ever be a time when the world moves beyond x86. The very idea is 1980s thinking, plain ignorance of what may be the most important trend in the microprocessor industry.

The rest of Thursday's GigaOM post is a hopelessly self-contradictory muddle that fails to reach any clear conclusions. I'll just quote one more line near the end: "But the PC will be just one small (and shrinking) battleground to keep x86 relevant, amid a more mobile, visual, and power-sensitive world."

Current economic woes aside, the PC market is hardly shrinking. You know what's shrinking? The PC! As the PC shrinks, the PC market will grow. The MID (mobile Internet device) market isn't much to speak of right now, for example, but once MID makers figure out what to build, MIDs will become more popular.

And seriously, is anyone really not clear on the fact that the Apple iPhone is a computer? It isn't an embedded system. An embedded system is one in which the presence of a microprocessor is functionally irrelevant to the user. When a gizmo exposes its programmability to the user, it's a computer.

What else is the App Store but the visible manifestation of the iPhone's programmability?

Now, ARM isn't dead yet. The iPhone uses an ARM processor because there's no x86 processor that would work as well in that system. ARM processors will probably see at least two more generations in cell phones just because there's so much ARM-based software out there (including all the software on the App Store).

But somewhere around 2012, we're going to see x86 chips poking into that space. The value of instruction set compatibility with the PC market will persuade developers of new cell phone platforms to go with x86 chips, and eventually even established systems like the iPhone will switch over.

So not only are x86 chips selling into a growing PC market, they'll eventually start eating into ARM's own strongholds. That can't be bad for Intel.

And that's why the GigaOM piece was preposterous.