
Despite its aging design, the x86 is still in charge

With most of the world's software written with x86 in mind, it's doubtful that any future chip architecture would be able to displace it.

Tom Krazit, former staff writer, CNET News
Few computing technologies from the late 1970s endure today, with one notable exception: the fundamental marching orders for the vast majority of the world's computers.

The x86 instruction set architecture (ISA), used today in more than 90 percent of the world's PCs and servers, hit the marketplace in 1978 as part of Intel's 8086 chip.

So when Intel's worldwide developer community gathers for its annual conference in Beijing later this month, attendees will spend most of their time talking about technology that was developed when Jimmy Carter was in the White House and the soundtrack to the John Travolta movie Saturday Night Fever was the best-selling album in the United States.

Other instruction sets--which are, essentially, lists of operations that a software program can use--do exist, of course. There's IBM's Power, Sun Microsystems' SPARC and Intel's own EPIC (explicitly parallel instruction computing) Itanium project, to name a few. But x86 continues to thrive and has no serious competitors on the horizon, both because it provides "good enough" performance and because of the vast amount of software written for it over nearly three decades.


"If you look at the history of computing, big moves happen because there is a dramatic new requirement or change in the marketplace," said a professor of computer science and engineering at the Massachusetts Institute of Technology who uses the single name Arvind.

But x86 is apparently an exception to the rule. Whether it's the invention of the browser or low-cost network computers that were supposed to make PCs go away, the engineers behind x86 find a way to make it adapt to the situation.

Is that a problem?

Critics say x86 is saddled with the burden of supporting outdated features and software, and that improvements in energy efficiency and software development have been sacrificed to its legacy.

A comedian would say it all depends on what you think about disco.

Humble beginnings
The x86 ISA made its debut with Intel's 8086 processor in 1978. Even at the time, it wasn't considered the most elegant design on the market because of the way it addressed memory, said Dean McCarron, an analyst with Mercury Research. IBM chose a slightly different version of the chip--the 8088--for its new PC, and the x86 architecture started to gain traction.

"It was originally thought about as an eight-bit chip (Intel's and Advanced Micro Devices' current chips are 64-bit) designed to run spreadsheets," said Phil Hester, chief technology officer at AMD. Accordingly, the original design lacked, among other things, an adequate number of the general-purpose registers that the modern computing era would demand. Registers are essentially small holding stations for data awaiting processing, and general-purpose registers are useful because they can store either a piece of data or the address where that data is stored.


As the number of people using PCs made by IBM and so-called clone manufacturers grew, the x86 became the irreplaceable heart of the PC market. In the mid-1990s, Intel's entry into the server market with x86 chips cemented the ISA's dominance. Today, more than 90 percent of all servers shipped in the world use an x86 processor from either Intel or AMD.

Intel and AMD have managed to keep x86 fresh by continually adding extensions to the ISA, such as Intel's MMX and SSE instructions in the late 1990s, which improved multimedia and graphics performance, and AMD's 64-bit extensions this decade, which helped bypass the register shortage. "We have seen a huge amount of change at the instruction level; we just keep calling it the same thing," said Rick Rashid, a senior vice president at Microsoft in charge of that company's research division.

But each generation of extensions adds more complexity to the chips, and support for the older features remains in place to guarantee software compatibility.

"There's no reason whatsoever why the Intel architecture remains so complex," said Simon Crosby, chief technology officer at virtualization software start-up XenSource. "There's no reason why they couldn't ditch 60 percent of the transistors on the chip, most of which are for legacy modes."

If a chipmaker declared its chip could run only software written past some date such as 1990 or 1995, you would see a dramatic decrease in cost and power consumption, Crosby said. The problem is that deep inside Windows is code taken from the MS-DOS operating system of the early 1980s, and that code looks for certain instructions when it boots.

This was part of the motivation behind Intel and Hewlett-Packard's EPIC project: a "clean-sheet" design that would remove many of x86's idiosyncrasies and support for legacy technologies, providing a modern foundation for the next 20 years.

Instead, EPIC became a lesson in how not to introduce a new instruction set. Software developers shied away from porting their work to an unproven architecture, and early rollout problems hindered Intel and HP's chances of building a broad market for the processor. The warm embrace of AMD's Opteron x86-64 processors (an approach Intel later duplicated) was the final blow, relegating EPIC and Itanium to the high end of the server market, where the performance gains can justify the cost of porting applications.

As with most things, it all came down to money. Billions of dollars have been invested in software written for x86. Even Intel--one of the most influential companies in the technology industry--couldn't convince software developers to move away from all those investments.

Is there an alternative?
Last year, Intel Chief Technology Officer Justin Rattner said the company had no plans to develop a new ISA in the foreseeable future. Microsoft's Rashid said his group doesn't have any projects that involve a rival instruction set, although Microsoft supported several different instruction sets as recently as 1999 with Windows NT 4.0.

So what might change the game? Performance is always one way to make software developers sit up and take notice, but there's nothing dramatic on the horizon. It's unlikely that any so-called "clean sheet" design would be able to produce more than a 10 percent improvement in performance or power consumption over the modern x86 ISA, Hester said.

A performance improvement that small isn't going to encourage a dramatic move away from x86, said Pat Gelsinger, a veteran chip designer and senior vice president and general manager of Intel's Digital Enterprise Group. "We're delivering 2x performance gains every year" with existing designs that can still run older applications.

The chip industry's ability to continue packing transistors onto its processors means that it dedicates fewer and fewer transistors--out of the whole--to keeping legacy code alive. "The burden of compatibility is there," Gelsinger said. "But the value of compatibility overwhelms the cost it brings with it."

One technology improvement that could be a wild card in the mix is the introduction of new chips with two or more processing cores. Chipmakers have settled on building chips with several lower-speed processor cores as a way of getting around power consumption problems caused by a single high-speed core. Right now, however, each core needs to use the same instruction set.

Some think a hybrid future is possible: smaller, more power-efficient cores using other ISAs could sit alongside an x86 core, dedicated to specific tasks such as video processing, Arvind said.

IBM is doing something like this with its Cell processor design, found at the heart of Sony's PlayStation 3. Cell uses one PowerPC core in a sort of supervisory role over eight separate processing units. Further down the road, chip companies could keep a basic x86 core to maintain backward compatibility and handle the next generation of complicated processing tasks with dedicated hardware--which may or may not run x86.

The earliest parts of this transition can be seen in efforts such as AMD's Fusion project, in which it plans to integrate a graphics processor onto a PC processor, McCarron said. By the next decade, processors with a mixture of cores using different ISAs could become a reality, he said.

But don't count on it.

"What has worked in (x86's) favor is that it's an evolutionary architecture, when problems come up it gets adapted," McCarron said. "This is ultimately the one that got picked. And for everything to work with each other, that's what we stick to."

CNET News.com's Stephen Shankland contributed to this report.