Pat Hanrahan and Ed Catmull helped pioneer graphics technology that enabled Toy Story and revolutionized computing.
Two original Pixar Animation Studios employees have won the Turing Award for developing the computer graphics technology that was key to creating Toy Story, the first feature-length computer-animated film, and that's now built into every phone and laptop. The award, sometimes called the Nobel Prize for the computing industry, went to Pat Hanrahan and Ed Catmull, the Association for Computing Machinery said Wednesday.
"It's a shocker," Hanrahan said about finding out he won the award, which is given annually without nominees being tipped off in advance. Catmull used the same word to describe his reaction.
In computer circles, however, the news is far from shocking, given how important computer graphics have become. Catmull created fundamental technology that helped move computer graphics away from crude wireframe models toward visualizations of objects with curved surfaces. He also came up with texture mapping, which lets a designer effectively glue a 2D image to the surface of a 3D object for a more realistic appearance. Catmull was the first president of Pixar, a 1986 spinoff of George Lucas' Lucasfilm that Steve Jobs funded and later led as chief executive.
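The texture-mapping idea can be shown with a minimal sketch: each point on a 3D surface carries a 2D "UV" coordinate, and rendering that point just looks up the image there. (The checkerboard stand-in texture, the `sample_texture` helper, and the nearest-neighbor lookup are illustrative assumptions, not Catmull's original formulation.)

```python
def sample_texture(texture, u, v):
    """Nearest-neighbor lookup: map (u, v) in [0, 1] to a texel."""
    h = len(texture)
    w = len(texture[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

# A tiny 4x4 checkerboard standing in for a real 2D image.
checker = [[(x + y) % 2 for x in range(4)] for y in range(4)]

# Two surface points with different UV coordinates land on
# different squares of the "glued-on" image.
print(sample_texture(checker, 0.1, 0.1))  # → 0 (dark square)
print(sample_texture(checker, 0.4, 0.1))  # → 1 (light square)
```

Real renderers refine this basic lookup with filtering and mipmapping to avoid aliasing, but the core idea is the same coordinate-to-image mapping.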
Hanrahan, now a professor at Stanford University's Computer Graphics Laboratory, worked at Pixar from 1986 to 1989. He was key to creating the company's RenderMan software, which automates computer graphics tasks like determining how light reflects off object textures. RenderMan was used for Toy Story, which came out in 1995, as well as every other Pixar movie. It was also used for 44 of the last 47 films nominated for a visual effects Academy Award, including Avatar, Titanic, the Lord of the Rings trilogy and several Star Wars movies.
The pair's work paved the way for increasingly realistic video games, such as Doom, Call of Duty and Forza Motorsport, and helped spur a renaissance in virtual reality. In a surprise development, the technology is now also key to supercomputers and to artificial intelligence tasks that don't involve graphics. The computer graphics the pair helped pioneer spawned an entirely new industry making dedicated chips called graphics processing units. These GPUs proved adept at training AI software to recognize human speech and getting supercomputers to simulate nuclear weapons explosions.
You have to be a pretty big deal in computing to win the Turing Award, which comes with a $1 million prize thanks to a financial boost from Google.
It's been given to the inventors of encryption technology that protects communications and e-commerce, the chip design used in every mobile phone and the World Wide Web. It has also honored the creators of the graphical user interface, modern AI technology, and the Unix operating system.
The Turing Award was previously given to a computer graphics researcher, Ivan Sutherland, who led the University of Utah research team where Catmull earned his Ph.D. (For a look at the state of the art then, check out the video of A Computer Animated Hand that Catmull helped create in 1972. The Library of Congress recognized it in 2011 by adding it to the National Film Registry.)
For all their work advancing computer graphics, Catmull and Hanrahan are acutely aware of the technology's limits. Too much realism in computer animation leads to the "uncanny valley," where human characters look almost like the real thing but are also different in ways that people find disturbing.
Achieving photorealism has always been an inspiration for computer graphics research, but never the goal when it comes to producing movies. "Animation fundamentally always has been caricatured," Catmull said. "Everyone is aware of the uncanny valley."
That's why Pixar has always stressed traditional storytelling, Catmull said. You burst into tears because of Sulley's emotional connection to Boo in Pixar's Monsters, Inc., not because the monster's hair looks convincing.
"You should be using the values in the story as the driver of the technology," Catmull said. "If you pay more attention to the visuals than to the story, it looks great but it's kind of boring."
Sticking to a story is a problem for VR, too, where creators can't be sure the audience is focused on the right part of a 360-degree world. "If you're looking all over the place, then VR is a barrier to getting engaged in the story," he said.
With each successive movie, Pixar pushed into new graphics domains, from translucent leaves in Pixar's second movie, A Bug's Life, to raindrops that oscillate when falling and that drip down windows realistically in 2019's Toy Story 4.
The amount of computing power available for computer graphics is amazing, Hanrahan said. He remembers when only the likes of Pixar and serious researchers could afford to pay $250,000 for a 512x512-pixel frame buffer, a memory module used to store image data for what today is a tiny image. The new Samsung Galaxy S20 Ultra, by comparison, has a 3,200x1,440-pixel screen that can be redrawn 120 times each second.
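The contrast Hanrahan draws can be put in rough numbers (a back-of-the-envelope sketch using only the figures quoted above):

```python
# Pixel counts for the two devices Hanrahan contrasts:
# a $250,000 512x512 frame buffer vs. a Galaxy S20 Ultra screen.
old_pixels = 512 * 512     # 262,144 pixels
new_pixels = 3200 * 1440   # 4,608,000 pixels

ratio = new_pixels / old_pixels
print(f"{ratio:.1f}x more pixels")  # → 17.6x more pixels

# Redrawn 120 times a second, that's over half a billion
# pixel updates per second on a phone.
print(new_pixels * 120)  # → 552960000
```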
"We always had this vision that it would become more ubiquitous," Hanrahan said, opening new creative avenues for many more people. But he got the schedule wrong.
"Ed had this vision we could make a movie -- a moonshot kind of thing. But I was a little skeptical," Hanrahan said. "I didn't think it would happen in my lifetime."