Intel chipset delay shows the devil's in the details
Intel has delayed first customer shipment of its Montevina core logic. Glaskowsky explains what could be behind the delay, and what this implies for the future of Intel's graphics technology.
As has been widely reported (for example, by EDN Magazine and here at CNET), Intel has delayed the first customer shipments (FCS) of its "Montevina" chipsets, part of the new Centrino 2 platform.
The delays are pretty short, however... a matter of just a few weeks.
Intel attributes the delays to two independent problems: one with FCC certification of the 802.11n WiFi feature in the chips (just "paperwork," Intel says), and one with the integrated graphics engines in some models.
Intel's probably right about the WiFi certification problem. I've been through the FCC certification process (for electromagnetic interference (EMI), at least); there sure is a lot of paperwork involved.
For the graphics problem, I see a couple of possible explanations.
Intel could have discovered a design flaw in the first production units severe enough to prevent them from being shipped, which would have caused a substantial delay while a new run of production units was completed. (See my earlier blog post for an explanation of how design flaws are related to product defects and faults.) This delay would have been largely hidden by the usual rounds of testing, but perhaps it just used up a little more time than the slack that was available in the schedule.
Or perhaps there was a design or manufacturing flaw that didn't require trashing the first production run, but which did require some additional testing and qualification to reject specific problematic parts. Such a flaw could involve slower or hotter operation than expected, for example. A problem of this kind would cause a shorter delay-- just the extra testing time. A statement from Intel in the Crothers post referring to "re-screening" suggests this is the situation here, although that statement could also describe testing a second production run to ensure the problem has been solved.
I find it interesting that this problem is related to Intel's new graphics engine, which is certainly the most important element of the new chipset. Intel's previous integrated graphics products have been criticized for not really being up to the challenges of running Windows Vista, including by Microsoft itself, yet Microsoft certified these chips as "Vista Capable." That's technically true-- I've used integrated-graphics platforms under Vista myself-- but the resulting shortfalls in performance and features probably discouraged many new Vista users.
Graphics engines are very complicated, and getting more complicated every year. Intel started out well enough in the graphics business when it worked with Real3D (now defunct) to develop the Intel740, a discrete graphics chip, but 18 months later it found itself already 18 months behind ATI and NVIDIA, and fell back to selling only integrated-graphics chipsets, where the graphics component is worth only a few dollars in incremental revenue.
Intel plans to get back into the market for discrete graphics chips in 2009 or (more likely) 2010 with "Larrabee," a multi-core CPU in which some cores are optimized for graphics processing. I think Larrabee will turn out to be a technical disaster, but Intel has leveraged its market domination to turn previous technical disasters into financial windfalls. Think of the Pentium 4's "Hyper-Pipelined" design, for example, which was too hot and too inefficient, ultimately forcing Intel to bring its predecessor, the P6 design, back from the grave several years later. Intel's current graphics engines, however, are barely worth selling today, and they won't be worth reviving after Larrabee has run its course.