
Intel researchers shine light on ray tracing

The co-director of Intel's Tera-scale computing research program and a colleague offer a primer on the company's work on future graphics technology.

Brooke Crothers

Brighter, crisper images are the goal for top Intel researchers in their work on future graphics technology.

I talked Wednesday with Intel's Jerry Bautista, the co-director of the Tera-scale computing research program, and Daniel Pohl, an Intel researcher. I focused mostly on a concept called ray tracing but also questioned them about Intel's upcoming Larrabee processor.

Reflections: ray tracing versus rasterized graphics (Image: Intel)

First, some background. Ray tracing--whether you agree or disagree about its viability--has been a fairly hot topic. It has been mentioned frequently by Intel over the last six months. An Intel blog titled "Real Time Ray-Tracing: The End of Rasterization?" and later comments by Intel executives that the company is looking at doing ray tracing on its processors set the stage for debate on the viability of ray tracing in mainstream gaming.

Ray tracing is a technique for rendering three-dimensional graphics using complex light interactions, allowing the creation of extremely detailed reflective surfaces, for example, with stunning photorealistic results.
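
To make the idea concrete, here is a minimal sketch (in Python, and not anything Intel has published) of the basic operation a ray tracer performs over and over: firing a ray and testing whether, and where, it strikes a piece of geometry, in this case a sphere.

    import math

    def intersect_sphere(origin, direction, center, radius):
        """Return the distance along the ray to the nearest hit on a sphere, or None.

        origin, direction, and center are (x, y, z) tuples; direction is assumed
        to be normalized. Solves |origin + t*direction - center|^2 = radius^2 for t.
        """
        oc = tuple(o - c for o, c in zip(origin, center))
        b = 2.0 * sum(d * o for d, o in zip(direction, oc))
        c = sum(o * o for o in oc) - radius * radius
        discriminant = b * b - 4.0 * c
        if discriminant < 0:
            return None  # the ray misses the sphere entirely
        t = (-b - math.sqrt(discriminant)) / 2.0
        return t if t > 0 else None

A full renderer runs a test like this for every pixel against every object in the scene, every frame, which is where the heavy processing demands come from.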

In the future, ray tracing may compete with today's traditional raster-based graphics used in games running on Nvidia and AMD-ATI graphics processors. Intel claims ray tracing runs better on general-purpose processors, such as its Core 2 Quad processors, than on traditional graphics processors. Ray tracing may also run on future processors such as Larrabee.

Intel CEO Paul Otellini alluded to this at a Sanford C. Bernstein & Co. Strategic Decisions Conference last month. Asked who Intel's major future competitors are, Otellini responded, "In graphics, as we move up the food chain, we're bouncing into ATI via AMD and Nvidia more than we used to. And I don't expect that to abate anytime soon."

Ray tracing does move Intel up the graphics food chain: it excels where raster-based graphics falls short, according to Pohl. "What often happens is when you zoom in on a (raster-based) reflection, you can see the resolution limitations," he said. With ray tracing "we can zoom in as much as we want" without quality degradation, he said. "The rays get bounced off and follow the reflected path and that way we get the physically correct reflection," he said. (See image.)
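
The "physically correct reflection" Pohl describes comes down to a small piece of vector math: the incoming ray is mirrored about the surface normal at the point it hits, and a new ray is traced in that direction rather than reading from a prerendered reflection map. A hedged sketch of that step:

    def reflect(direction, normal):
        """Mirror an incoming ray direction about a surface normal.

        Implements r = d - 2(d . n)n; both arguments are (x, y, z) tuples,
        and normal is assumed to be unit length.
        """
        d_dot_n = sum(d * n for d, n in zip(direction, normal))
        return tuple(d - 2.0 * d_dot_n * n for d, n in zip(direction, normal))

Because the reflected color is recomputed by following the new ray into the scene, the result stays sharp no matter how far the camera zooms in.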

Asked about the level of processing power ray tracing requires, Bautista pointed to a demonstration Pohl built of an object placed between two mirrors, which propagates hundreds of reflections, each image smaller than the last. "There's a whole lot of computation to make that happen." But he added: "The kind of images that (Pohl) was creating were very complex and difficult and they could not have easily been done in a rasterized approach."
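
The mirror-between-mirrors demo maps naturally onto recursion: every hit on a reflective surface spawns another ray, so the work multiplies with each bounce, and real tracers cap the recursion depth. The sketch below is only an illustration; find_nearest_hit and shade are hypothetical helpers standing in for the rest of a renderer, and the 80 percent reflectivity is an arbitrary choice.

    MAX_BOUNCES = 8  # illustrative cap; two facing mirrors would otherwise recurse forever

    def trace(origin, direction, scene, depth=0):
        """Follow one ray through the scene, recursing on reflective surfaces."""
        if depth > MAX_BOUNCES:
            return (0.0, 0.0, 0.0)  # give up and return black
        hit = find_nearest_hit(origin, direction, scene)  # hypothetical helper
        if hit is None:
            return scene.background_color
        color = shade(hit)  # hypothetical local lighting term
        if hit.reflective:
            bounced = reflect(direction, hit.normal)  # reflect() as sketched above
            reflected_color = trace(hit.point, bounced, scene, depth + 1)
            color = tuple(0.2 * c + 0.8 * r for c, r in zip(color, reflected_color))
        return color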

Bautista said that to handle graphics like this, the more processing cores there are, the better. "In the many-core world we can use as many transistors as we can get. A general-purpose compute engine like a CPU, whether it's Larrabee or any of our processors, will efficiently compute these ray tracing problems. The more cores we have the better. Provided that we can supply memory bandwidth to the device." (Update: Bautista did add that memory bandwidth is a limitation. Also: when Intel refers to "many-core" it means not only quad-core processors but the small mini-cores found in a processor such as Larrabee, which is expected to have many tiny x86 processing cores.)

And highly parallel tasks like ray tracing scale well, he said. "Highly parallel workloads have been simulated (at Intel) in detail, executed out to thousands of cores. And we find that these applications scale linearly: if we double the number of cores, the throughput roughly doubles."
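
That linear scaling follows from the structure of the problem: each pixel's ray is independent of every other, so an image can be carved into rows and handed out to however many cores are available. A rough sketch of that division of labor, using Python's standard multiprocessing pool rather than anything Intel-specific (trace_pixel is a hypothetical per-pixel tracer):

    from multiprocessing import Pool

    WIDTH, HEIGHT = 1920, 1080

    def render_row(y):
        """Trace one scanline; no pixel depends on any other pixel."""
        return [trace_pixel(x, y) for x in range(WIDTH)]  # trace_pixel: hypothetical

    if __name__ == "__main__":
        # With N worker processes the throughput grows roughly N-fold,
        # provided, as Bautista notes, that memory bandwidth keeps the cores fed.
        with Pool() as pool:
            image = pool.map(render_row, range(HEIGHT))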

Ray tracing and computer games
Where will ray tracing first appear in the mass market? Would ray tracing appear in a game like Crysis? "It will be in games that are at the cutting edge," Bautista said. "There are other game developers that tend to focus on the aesthetics of games. Interesting surfaces that are reflecting, space scenes with brilliant sunshine. Those kinds of companies will naturally gravitate to ray tracing sooner than the ones that won't require it," he said.

"Just like any other cutting-edge technology, like anti-lock brakes, they all initially appear on high-end things and then eventually find their way to the masses," he said.

But neither Bautista nor Pohl thinks raster-based graphics is going to be replaced soon by ray tracing. "It's not like raster graphics is going to go away. We're not saying anything like that," Bautista said. "Ray tracing and raster graphics will co-exist for quite some time. When the hardware and software scale well and when the needs of the user push you to ray tracing, when that happens we don't know. It's not likely to be some hard cut over at some point."

In the latter half of the interview, Bautista expounded on ray tracing and Intel processors. "With regard to execution, at its heart, ray tracing is the collision of rays with surfaces. So it's really collision detection. And collision detection, whether that's light rays or bodies in a game that are colliding with one another or a bullet that's fired from a gun in a game that collides with a wall, those are all basic physics," he said. "And ray tracing is nothing more than basic physics."
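
That framing is easy to see in code: the same intersection test that decides what a light ray hits can decide whether a bullet's flight path hits its target. A brief illustration, reusing the hypothetical sphere test sketched earlier:

    def bullet_hits_target(muzzle, aim_direction, target_center, target_radius):
        """Game physics reusing the rendering test: a bullet's path is just another ray."""
        distance = intersect_sphere(muzzle, aim_direction, target_center, target_radius)
        return distance is not None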

"Physics in itself is a very general-purpose computing problem. And physics fit very well on a general-purpose compute engine. It doesn't mean you couldn't do it on a GPU (graphics processing unit) but we'd say it tends to fit a little better on a CPU (central processing unit)," Bautista said.

Finally, he talked about the application of Intel processors to game artificial intelligence (AI). He said people seek online games because they want intelligent interaction. "Online games are so predominant because the computer is boring. It doesn't do things that are interesting and peculiar and unpredictable (in the same way another person is unpredictable). What if the AI engine in the game was sophisticated enough that it did things that were interesting and unpredictable? You would be happy playing that game and not going to the Internet."

"Today, 16-core but, gosh, I wish I had 64-core so I could do some AI and some real physics," he said.

Does that mean a 64-core Intel chip is coming? Intel's not saying. Not yet at least.