
Nvidia in the throes of remaking itself

The company hopes to parlay the power of its Fermi chip design into the mainstream, but challenges from Intel and AMD make the here and now for Nvidia uncertain.

By Brooke Crothers

Updated at 4:40 p.m. PDT: adding to discussion of next-generation Nvidia Ion chip.

As graphics kingpin Nvidia tries to reshape itself into a broad-based computing company, it is taking big gambles with potentially big payoffs, while it fends off challenges from rivals Intel and Advanced Micro Devices.

The world's largest supplier of standalone graphics chips for PCs needs to grow. Established markets have matured and Nvidia must seek out other ways to make money.

Nvidia's Fermi-based GF100 graphics processor.

"In almost every market they have entered they have become dominant," said Jon Peddie, president of Jon Peddie Research, which tracks the graphics chip market. "Almost 90 percent market share in the workstation business and 55 to 65 percent in the graphics business. But if you're that successful you can't really grow the market anymore, and if you want to keep growing your company, then you have to get into new markets."

Enter supercomputing and Nvidia's brand-new Fermi architecture. "That's a huge market and big margins," said Peddie. Fermi was announced to great fanfare at an Nvidia conference last week, where the prestigious Oak Ridge National Laboratory said it plans to use Fermi in a future supercomputer.

It would be an understatement to say that the Fermi chip potentially packs a computing wallop. The chip integrates an astounding 3 billion transistors--about three times the number in Nvidia's most powerful graphics chip now on the market--and, for the first time, Nvidia has designed the architecture with features that make it better suited to high-performance computers.

Fermi GPUs, each containing 512 processing cores, would enable "substantial scientific breakthroughs" that would be impossible without the new technology, Jeff Nichols, Oak Ridge's associate lab director for computing and computational sciences, said last week.

Nvidia hopes to parlay this computing power into the mainstream. (For a comparison of Fermi with AMD's newest graphics chip see: ATI and Nvidia face off--obliquely.)

"Fermi will offer Nvidia the opportunity to grow our consumer business by having the fastest raw graphics power," said Drew Henry, general manager of Nvidia's bread-and-butter GeForce graphics business. "But it's also going to expand our business by allowing people to process better video and photo applications and to use the GPU for many, many more mainstream applications." (GPU stands for graphics processing unit.)

Henry is referring to a technology called GPU Compute, which takes advantage of new features in Windows 7 and Apple's Snow Leopard operating systems that turn the graphics chip into a general-purpose compute engine, accelerating everyday consumer applications--not just games--sometimes many times over.

"Windows 7 elevates the GPU to be a co-processor along with the (Intel chip)," Henry said. He pointed to the examples of technology from Cyberlink that does "face tagging," allowing users to sort through the gigabytes of photos they may have on their PCs to identify common faces, and from VReveal, which helps users clean up and enhance video taken with a cell phone.

Legal scuffles and graphics rivals
While Fermi holds great promise, the here and now for Nvidia is uncertain. Due to legal challenges from Intel, Nvidia will no longer develop Intel-compatible chipsets, a lucrative business. This imbroglio can be summed up in three words: Direct Media Interface (DMI), a chip interconnect technology that Intel is using in its newest crop of Core "i" series processors--namely, the i3, i5, and i7--which Nvidia is barred from accessing.

"We are not going to develop a DMI chipset for Intel. We're not investing in that area," said Henry, adding that Nvidia will continue to make chipsets for AMD processors.

Nvidia redesigned its next 'Ion' chip to thwart Intel.

And this legal scuffle is whacking Nvidia's successful Ion chipset line--used in everything from Apple MacBooks to Hewlett-Packard Netbooks. But Nvidia will push on, bringing out a second generation of Ion silicon redesigned to work with Intel's future Atom Netbook technology and future notebooks.

"While we won't build an integrated chipset, we have wonderful solution for providing our GPU technology. It will have substantially more performance than our current generation," Henry said. "We have a different strategy for connecting GPUs."

Intel's upcoming Atom silicon is a departure for the company: for the first time, Intel is putting the graphics function onto the same piece of silicon as the main processor.

Henry said the new Ion technology has been designed to withstand future legal challenges from Intel. "Intel won't be able to block this particular product, so consumers will have access to Nvidia technology," he said.

Intel, however, may not be Nvidia's biggest immediate concern. AMD, via its ATI graphics chip unit, has just announced a bevy of high-performance, well-received graphics cards that have knocked Nvidia back on its heels.

"The problem is that ATI has beaten Nvidia to market with new gaming boards, and Nvidia is going to miss the window for the new holiday season with (no) new parts," said Peddie. "And these (ATI products) are very high-performance boards. And they're going to be DirectX 11 compatible for Windows 7. And Nvidia just doesn't have that right now." (The Windows 7 DirectX 11 technology accelerates gaming and other features in Microsoft's new operating system.)

Nvidia's Fermi, when it arrives in a few months, should close this competitive gap quickly. "Once Fermi comes out as a gaming board, you'll hear a different tune from Nvidia," said Peddie, who believes that Fermi also ups the ante for Intel when Intel brings out its long-awaited "Larrabee" graphics chip in 2010.

Tegra: Downsizing into smartphones
Nvidia's Tegra chip, on the other hand, is already here and is already being used in Microsoft's Zune HD and Samsung's M1 media players. Tegra is a radical departure from Nvidia's past of big, power-hungry graphics chips: it is a tiny system-on-a-chip that squeezes a total of eight independent processors, including a GeForce graphics processor, onto one piece of silicon.

Nvidia CEO Jen-Hsun Huang has not been bashful about his expectations for Tegra, saying back in June that Tegra may represent half of Nvidia's business in a few years, with the rest divided among the Quadro workstation, Tesla supercomputing, and GeForce consumer lines.

Tegra could have a watershed year in 2010 as it finds its way into smartphones. "They've got 70 design wins. That business is going to show growth in 2010," said Peddie.

Next year will also be a pivotal period for Nvidia overall. Don't underestimate the company, says Peddie, writing in a blog post this week. "One of Jen-Hsun Huang's key strengths is his long-range vision...I was there 16 years ago when he had the vision for the gaming market, and I must confess I thought he was reaching a bit far, but he systematically built for his vision and history has proved him right; and a lot of people have made money because of it."