
Itanium's rocky road to stardom

CNET News.com's Michael Kanellos writes that little has gone right for Intel's new chip line. Expectations remain high, but will they ever be fulfilled?

Michael Kanellos Staff Writer, CNET News.com
Michael Kanellos is editor at large at CNET News.com, where he covers hardware, research and development, start-ups and the tech industry overseas.
Intel's Itanium processor is a brilliant piece of engineering. There's also a remote possibility that it could wind up as the world's next Dymaxion.

The Dymaxion was a three-wheeled car invented by R. Buckminster Fuller, the technological visionary who also promoted the geodesic dome and sixties-style furniture. The Dymaxion could turn tighter than conventional cars, had a rear engine and a steering tail fin. It crashed on its public debut in 1933.

In a similar vein, Itanium stands on the cusp of greatness, which, unfortunately, is located near the banana peel of disaster. The chip features an architecture that's entirely different from that found in current Intel processors. It gets rid of old clutter, and proponents tout its high performance at low cost. But are PC makers and corporate customers ready to switch?

Itanium, a 64-bit processor co-designed by Intel and Hewlett-Packard, will run in servers and supercomputers containing anywhere from four to over 1,000 processors. Ideally, the chip will provide performance increases and bring PC-like economics to a market where prices typically start at $100,000.

Despite massive investments in R&D and industry evangelism by Intel, however, all is not going according to plan. Originally due in the mid- to late 1990s, the first version of the chip, code-named Merced, was pushed back by a series of delays until May 2001. Partly because of the delays and partly because of its performance, Merced became a test vehicle for McKinley, its successor, due in the first half of 2002.

While McKinley is expected to outperform and outsell the first version of Itanium, Intel faces another problem: inertia. Few applications exist for the chip, and computer makers say demand will take off only gradually.

The issue centers on the chip's architecture. Intel chips have historically been based on the X86 instruction set, which is essentially the language a chip understands. Itanium, by contrast, uses the EPIC instruction set. Only two companies have managed to successfully switch instruction sets: IBM, with the Power chip, and Digital, with the fading Alpha. The transition is so difficult that most companies simply continue to enhance their existing designs.

Intel has run into this sort of difficulty before.

In the early 1980s, the company's top designers were working on Sierra, a 32-bit successor to the 16-bit 286 chip that would have featured an entirely new instruction set. The "B" team was given the assignment of producing a 32-bit chip that kept the X86 instruction set, and Sierra never saw the light of day. Pat Gelsinger, leader of that B team, is now the company's chief technology officer.

What might Intel do in this situation? How about coming out with a 64-bit chip that used the X86 instruction set? Such a chip would be compatible with existing software, and therefore easy for manufacturers and customers to adopt.

It would also be easy to pull off. In fact, Intel contemplated this option early on, according to sources. Such a project would also effectively kill the appeal of Hammer, a 64-bit chip from Advanced Micro Devices that uses the X86 instruction set.

"My biggest fear is that Intel will come out with a 32-bit processor with 64-bit extensions because it is the right thing to do," said AMD CEO Jerry Sanders in November. "The Itanium, it turns out, is a niche product. We are going to have a role in the industry because we better fulfill Microsoft's needs."

Some quarters of the industry are intrigued by the idea. But while it's technologically feasible, Intel would find a move in this direction financially and intellectually gut-wrenching. It would also end so many careers inside Intel that it's difficult to envision it ever coming to pass.

Difficult, but not impossible. In 1996, the company declared that future chips would be paired with memory based on designs from Rambus. PC makers, analysts and others were convinced that Intel's steadfast resolve would ensure that Rambus would become the gold standard for memory.

Memory makers, however, began to complain about the high cost of building manufacturing capacity for Rambus. Prices failed to drop as expected, and delays piled up. DDR DRAM, an alternative almost given up for dead in 1998 and 1999, suddenly became a lot more appealing. By July 2000, Intel had declared it would pair its chips with other types of memory, pushing Rambus to the fringes.

As for the future?

For the record, an Intel representative denied that any project is under way for a 64-bit X86 chip and said the company stands 100 percent behind Itanium.

"I expect at some point in the future they will do sort of expansion on the 32-bit," said microprocessor analyst Kevin Krewell. But it won't be an easy decision, not after Intel's investment of hundreds of millions into the program. "They are going to keep pushing that rock up the hill until they can't. Then they will fall back. "