Larrabee

Intel to lay out supercomputing chip plans

Intel on Tuesday provided more color on its plans for supercomputing chips that would eventually compete with offerings from Nvidia. The company said it will offer further details next week at a supercomputing conference.

In the wake of its cancellation of the "Larrabee" graphics chip project in December of last year, Intel is now focusing on an analogous project aimed at supercomputers, a market generally referred to as high-performance computing, or HPC.

"We are...executing on a business opportunity derived from the Larrabee program and Intel research in many-core chips," Bill Kircos, an Intel … Read more

Ghosts of projects past haunt Intel graphics chip

Intel's checkered past on some large projects means the chipmaker must prove that Larrabee isn't a development flub that will simply be kept on life support for the next few years.

As reported last week, Intel's Larrabee graphics chip was killed as an initial product offering after protracted delays, demonstrating that as successful as Intel is, it's not immune to major product missteps.

If certain product histories are any indication, the challenge could be daunting. Intel's XScale processor for small devices--which was used in Compaq handhelds back in 2000--was sold off to Marvell in 2006 after an unsuccessful run. And its Itanium processor has been the object of perennial ridicule as a product hanging on for dear life after getting off to a very rocky start back in 1998. Sun Microsystems' former CEO Scott McNealy eventually dubbed the chip the "Itanic" as a play on the word Titanic.

"If you go back in history when they started down the Itanium path in the mid-90s, they said they were going to have a really whiz-bang product, but by the time they finally got it out, it was decidedly ho-hum or even worse," said Nathan Brookwood, principal analyst at Insight 64.

"They've learned that you don't ship a product the first time around so that when it finally does appear people go 'What was all the fuss about?'" he said.

Larrabee, as we know now, was not ready for prime time. "It was for all intents [and] purposes an Intel project--a test bed, some might say a paper tiger," said Jon Peddie, president of Jon Peddie Research, writing in a blog.

For better or worse, Intel is expected to forge ahead with Larrabee, with a real product not appearing until… Read more

Buzz Out Loud Podcast 1120: Make a lot of nickels, Microsoft

Microsoft cancels its family licensing program and Molly decides it needs a lesson in economics. Stop focusing on dimes, Microsoft! We also plead for some common sense in the case of the woman jailed for recording some of the new "Twilight" movie at a birthday party.


Apple buys Lala service
http://arstechnica.com/apple/news/2009/12/apple-buys-music-streamer-lala-but-whats-it-getting.ars?utm_source=rss&utm_medium=rss&utm_campaign=rss
http://news.cnet.com/8301-31001_3-10410206-261.html
http://www.appleinsider.com/articles/09/12/07/apples_lala_purchase_could_bring_browser_access_to_itunes_content.html … Read more

Intel: Initial Larrabee graphics chip canceled

Intel said Friday that its Larrabee graphics processor will initially appear as a software development platform only.

This is a blow to the world's largest chipmaker, which was looking to launch its first discrete (standalone) graphics chip in more than a decade.

"Larrabee silicon and software development are behind where we hoped to be at this point in the project," Intel spokesman Nick Knupffer said Friday. "As a result, our first Larrabee product will not be launched as a standalone discrete graphics product," he said.

"Rather, it will be used as a software development … Read more

ATI and Nvidia face off--obliquely

Nvidia and Advanced Micro Devices' ATI division are taking different approaches to graphics processing in the next generations of their products. Both strategies have strengths and weaknesses, and I think it's too soon to pick the eventual winner in this long-running fight.

Before I get into my analysis, I should say that Nvidia paid me to write a white paper on the implications of its new GPU architecture (code-named Fermi) for high-performance computing applications. The white paper was released as part of the Fermi launch event at Nvidia's GPU Technology Conference last week.

Nvidia also paid for white papers from two other well-known microprocessor analysts, Nathan Brookwood of Insight 64 and my friend and former colleague Tom Halfhill of Microprocessor Report. UC Berkeley professor David Patterson wrote a fourth white paper, and Nvidia wrote one of its own. Each of these works takes a different approach to the subject; all are worth reading if you need to understand what Fermi is all about.

In short, I think the Fermi architecture has been more thoroughly white-papered than any graphics chip design in history. All five of these documents are available on the Fermi home page on Nvidia's Web site, and just in case that page is moved or changed, you're welcome to take advantage of my own mirror of my white paper.

I've spent much of the last several days reading these documents plus David Kanter's excellent article on Fermi over on his Real World Technologies site. David managed to get some details on Fermi that Nvidia didn't give to the rest of us.

I've also had time to go through the coverage of ATI's recent launch of the RV870, which is what Nvidia's Fermi-based chips will be competing against. The first of Nvidia's chips bears the internal code name of GF100, and it's huge. Here's a life-size photo:… Read more

Intel shows off Larrabee graphics chip for first time

SAN FRANCISCO--Heads up, Nvidia. Intel demonstrated its Larrabee graphics chip for the first time Tuesday at the Intel Developer Forum.

Larrabee will be Intel's first discrete, or standalone, graphics processor in about 10 years and is expected to compete with graphics chips from Nvidia and AMD's ATI unit. The demo used an early "stepping," or version, of Larrabee, which is expected to come out commercially sometime next year.

Larrabee will be targeted initially at the gaming market. The demonstration was based on the game Enemy Territory: Quake Wars from Splash Damage. (See video.)

"This is … Read more

A new view of 3D graphics

Have we reached the end of the road for conventional 3D rendering?

Siggraph 2009 ended Friday, and I've spent the last few days digesting what I learned there. Although I've been involved in the graphics industry since 1990 and I've attended Siggraph most years since 1992, a crisis of sorts seems to have snuck up on me.

At the High Performance Graphics conference before the main show, keynote speeches from Larry Gritz of Sony Pictures Imageworks and Tim Sweeney of Epic Games showed that traditional 3D-rendering methods are being augmented and even supplanted by new techniques for motion-picture production as well as real-time computer games.

Gritz reckoned that 3D became a fully integrated element of the moviemaking process in 1989 when computer-generated characters first interacted with human characters in James Cameron's "The Abyss."

Gritz described how Imageworks has moved to a new ray-tracing rendering system called "Arnold" for several films currently in production, replacing the Reyes (Renders Everything You Ever Saw) rendering system, probably the most widely used technology in the industry.

According to Gritz, Reyes rendering led to unmanageable complexity in the artistic component of the production process, outweighing the render-time advantages of the Reyes method. But Gritz says even these advantages diminished as the demand for higher quality drove Imageworks to make more use of ray tracing and a sophisticated lighting model called global illumination.

The bottom line for Imageworks is that Arnold, which was licensed from Marcos Fajardo of Solid Angle, takes longer to do the final rendering but is easier on the artists, simplifying the creation of models and lighting effects--a net win.
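For readers who haven't worked with ray tracing, the core operation is simple to sketch: follow a ray from the camera into the scene and test it against geometry. The snippet below is a generic Python illustration of the ray-sphere intersection test a ray tracer evaluates millions of times per frame; it has nothing to do with Arnold's actual code.

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None.

    origin, direction, and center are 3-tuples; direction is assumed
    to be normalized. This is the primitive operation a ray tracer
    evaluates for every pixel and every bounce of light.
    """
    oc = tuple(o - c for o, c in zip(origin, center))   # origin relative to sphere
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c        # discriminant; a == 1 since direction is unit-length
    if disc < 0:
        return None               # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None   # nearest hit in front of the ray origin

# A primary ray fired down the z-axis at a unit sphere centered at z = 5:
print(ray_sphere_intersect((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```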

Sweeney echoed this theme the next day, which surprised me considering Sweeney's focus is real-time rendering for 3D games--notably with Epic's Unreal Engine, which has been used in hundreds of 3D games on all the major platforms. Game rendering uses far less sophisticated techniques because each frame has to be rendered in perhaps one-sixtieth of a second, not the four or five hours on average that can be devoted to a single frame of a motion picture.
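To put those two time budgets side by side, here is a quick back-of-the-envelope comparison, using the one-sixtieth-of-a-second and four-to-five-hour figures above; the exact ratio obviously depends on those estimates.

```python
# Rough comparison of per-frame rendering time budgets (illustrative numbers only).
game_frame_s = 1.0 / 60.0        # real-time game target: about 16.7 ms per frame
film_frame_s = 4.5 * 60 * 60     # film production: roughly 4-5 hours per frame (midpoint)

ratio = film_frame_s / game_frame_s
print(f"Game frame budget: {game_frame_s * 1000:.1f} ms")
print(f"Film frame budget: {film_frame_s / 3600:.1f} hours")
print(f"Film renderers get roughly {ratio:,.0f} times more time per frame")
# With these assumptions the ratio is close to a million to one, which is
# why real-time engines lean on much cheaper approximations.
```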

It seems that Sweeney is also… Read more

Hot days and Hot3D in New Orleans

Two companies--respectively (I believe) the smallest and largest makers of graphics chips--announced on Sunday that they are developing new standard APIs (application programming interfaces) specifically for ray-traced computer graphics.

Caustic Graphics introduced CausticGL, an API designed to leverage the best aspects of OpenGL, the most widely supported 3D API on the market. CausticGL ties in with Caustic's accelerator chips and boards, which the company says can deliver some 20X the ray-tracing performance of a conventional CPU.
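To get a feel for why a claimed 20X speedup matters, it helps to rough out how many rays a real-time application would need to trace. The numbers below are my own illustrative assumptions, not figures from Caustic.

```python
# Hypothetical ray budget for real-time ray tracing (assumed figures, for scale only).
width, height = 1920, 1080   # assumed display resolution
fps = 60                     # assumed frame rate
rays_per_pixel = 10          # one primary ray plus a handful of shadow/reflection rays

rays_per_second = width * height * fps * rays_per_pixel
print(f"About {rays_per_second / 1e9:.1f} billion rays per second")  # ~1.2 billion
# A 2009-era CPU traces far fewer rays than that, so even a 20X accelerator
# only begins to close the gap for fully ray-traced real-time scenes.
```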

Nvidia offered OptiX (pronounced like "optics"), a name designed to resonate with PhysX, the physics API acquired last year when Nvidia … Read more

GPUs and the new 'digital divide'

I spent Tuesday at Nvidia headquarters, attending the company's annual Analyst Day.

I've been to most of Nvidia's analyst events over the last decade or so, since I covered Nvidia almost from its inception while working as the graphics analyst at Microprocessor Report. These meetings are always a good way to get an update on the company's business operations, and sometimes--as this one did--they provide exceptionally good insight into larger industry trends.

Nvidia has had a rough couple of quarters in the market, which CEO Jen-Hsun Huang blamed in part on a bad strategic call in early 2008: to place orders for large quantities of new chips to be delivered later in the year. When the recession hit, these orders turned into about six months of inventory, much of which simply couldn't be sold at the usual markup.

In response, Nvidia CFO David White outlined measures the company plans to take to increase revenue, sell a more valuable mix of products, reduce the cost of goods sold, and cut back on Nvidia's operating expenses.

Three things stood out for me in this presentation:

Nvidia is planning an aggressive transition to state-of-the-art ASIC fabrication technology at TSMC, the company's manufacturing partner. Within "two to three quarters," White said, about two-thirds of the chips Nvidia sells will be made using 40-nanometer process technology. (The first of these chips were announced Tuesday.)

White also acknowledged something that I've long assumed to be true: Nvidia receives "preferential allocation" on advanced process technology at TSMC. It's logical that Nvidia should get the red-carpet treatment, having been TSMC's best customer for many years, but I don't recall hearing Nvidia or TSMC put this fact on the record before.

The third notable point from White's presentation concerned the gross margins for Nvidia's Tegra, an ARM-based application processor that Mike Rayfield, general manager of the Tegra division, says has already garnered 42 design wins at 27 companies. At "over 45 percent," those margins are much higher than I'd have guessed--quite excellent for an ARM-based SoC in such a competitive market.

More surprises

The technical sessions at the event contained their own surprises.

For example, Nvidia effectively seized control of an old Intel marketing buzzword: "balanced."

For years, Intel used to talk about… Read more

Intel creates European visual-computing center

Intel said Tuesday that it is investing $12 million in a visual-computing research center in Europe. This comes as Intel prepares to bring out its first graphics chip in more than a decade by early next year.

Opening Tuesday, the Intel Visual Computing Institute is located at Saarland University in Saarbrücken, Germany. The company says the center "will explore advanced graphics and visual computing technologies."

The investment, to be made over five years, represents Intel's largest European university collaboration, the company said.

"Intel's visual computing vision is to realize computer applications that look … Read more