
Apple eyeing Nvidia's CUDA technology?

CUDA makes it easier for developers to exploit Nvidia's graphics chips, and Apple might be getting ready to formally welcome that technology into Mac OS X.

Tom Krazit Former Staff writer, CNET News

SANTA CLARA, Calif.--Apple's Worldwide Developers Conference is expected to cover the parallel tracks of Mac and iPhone software development, but the company may have another aspect of parallelism to discuss next week.

Nvidia's CUDA technology could make it easier to transcode home movies--or hits like Ratatouille--into a format suitable for an iPhone. (Credit: Apple)

Nvidia CEO Jen-Hsun Huang, in an interview earlier this week, suggested that Apple might have plans for Nvidia's CUDA technology as part of the WWDC festivities next week. CUDA is a programming technology that allows software developers to take advantage of the unique parallel processing characteristics of graphics processors such as Nvidia's GeForce 8600M, found in the MacBook Pro. Nvidia released a beta version of CUDA for Mac OS X back in February.

"Apple knows a lot about CUDA," Huang said, implying the company might be ready to formally embrace Nvidia's technology to make it easier to exploit graphics chips inside Macs. Apple's implementation "won't be called CUDA, but it will be called something else," Huang said in an interview here at Nvidia's headquarters on Wednesday.

Software developers are interested in the potential of graphics chips because of their capacity for parallelism--the simultaneous execution of many computations. CPUs from Intel and AMD are designed as general-purpose processors, able to handle any kind of code a programmer can throw at the chip. But until multicore chips became all the rage, those CPUs were basically designed to tackle one problem at a time, then move on to the next, and software for those chips was designed accordingly.

GPUs, on the other hand, break a problem into many small pieces and process them simultaneously at very high speed. To this point, however, only specialized applications, such as graphics software or high-performance computing programs, have been able to take advantage of that raw horsepower. Nvidia, AMD, and Intel are all working on ways to let everyday programmers exploit the unique characteristics of graphics processors.
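To give a rough sense of the programming model (this example is illustrative and not drawn from Apple's or Nvidia's announcements), a CUDA program splits a computation across thousands of lightweight GPU threads, each handling one small piece of the data. A minimal sketch--adding two large arrays element by element--might look like this:

```cuda
#include <cstdio>
#include <cstdlib>

// Each GPU thread computes one element of the result, so the whole
// addition runs in parallel instead of looping one element at a time.
__global__ void addArrays(const float *a, const float *b, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        out[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;               // one million elements
    const size_t bytes = n * sizeof(float);

    // Allocate and fill host (CPU) arrays.
    float *a   = (float *)malloc(bytes);
    float *b   = (float *)malloc(bytes);
    float *out = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Allocate device (GPU) arrays and copy the inputs over.
    float *dA, *dB, *dOut;
    cudaMalloc(&dA, bytes);
    cudaMalloc(&dB, bytes);
    cudaMalloc(&dOut, bytes);
    cudaMemcpy(dA, a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dB, b, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    addArrays<<<blocks, threads>>>(dA, dB, dOut, n);

    // Copy the result back and check one value.
    cudaMemcpy(out, dOut, bytes, cudaMemcpyDeviceToHost);
    printf("out[0] = %f\n", out[0]);

    cudaFree(dA); cudaFree(dB); cudaFree(dOut);
    free(a); free(b); free(out);
    return 0;
}
```

Compiled with Nvidia's nvcc compiler and run on a CUDA-enabled GPU such as the GeForce 8600M, a kernel like this is dispatched across the chip's many processing cores at once--the same property that makes tasks like video transcoding so much faster on a GPU.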

For example, during my visit on Wednesday, Nvidia engineers demonstrated how a CUDA-enabled version of a program similar to QuickTime, running on a desktop or laptop, could dramatically speed up the process of transcoding a movie or television show into a format suitable for the iPhone.

The GeForce 8600M GT is one of the Nvidia graphics processors that are listed as CUDA-enabled on Nvidia's site. Huang declined to share specifics regarding Apple's intentions, but a conference of Mac developers would be a likely place to discuss any plans Apple might have for CUDA.