Rob Cook '73 never achieved his college aspiration of becoming a theoretical physicist. But the man who fell into computer programming after college as a way to pay bills has done all right for himself. As the co-creator of RenderMan, a computer-graphics program used in the vast majority of contemporary films, Cook earned a 2001 Academy Award for outstanding contributions to the motion picture field—the first Oscar to be awarded for software.
"They give out the technical awards a few weeks before the show, but you still get to go to the main event," says Cook. "So I got to be there carrying my Oscar around. There's only one night you can do that, where you belong to the club. It was a kick, but it was kind of like Cinderella and the ball—the next day the carriage and horses go back to being a pumpkin and mice."
Still, the magic keeps happening. RenderMan has become the industry standard for three-dimensional animation and computer graphics. It's been used in every nominated movie in the Academy Awards' visual-effects category for the past fifteen years. Regardless of your preferred genre of film, it's likely you have seen RenderMan in action, from family-friendly animated flicks such as Up, WALL·E, Ratatouille, and Cars to fantastic adventures such as Iron Man, The Curious Case of Benjamin Button, and all of the Harry Potter movies.
After graduating from Duke, Cook was unsure about his professional ambitions. His initial attraction to physics had faded over time. While working at Digital Equipment Corporation in the 1970s, he met the one person in the company working on computer graphics. Fascinated by the nascent field's reliance on creativity, mathematical modeling, and scientific expertise, he enrolled in Cornell University's master's program in computer graphics. His graduate thesis on computer graphics and simulation caught the attention of director George Lucas, who hired Cook straight out of grad school to work for Lucasfilm's fledgling computer division.
"Those were the early days of the field," says Cook, who is now vice president of advanced technology for Pixar Animation Studios. "Computer-graphics images were still primitive, a long way from what was needed for them to be widely used in animation or special effects. That made our goals seem ridiculously high, but in an odd way it was also what gave us hope—our understanding was so limited and our techniques were so crude that we figured there was bound to be a lot of room for improvement."
One of the first challenges they faced was making images appear natural. "We had to think about things like how light reflects off particular surfaces—copper, bronze, clay—and how to have control over the appearance of those surfaces. But that's not enough. For something to appear realistic, it can't be too pristine; there are always going to be imperfections like dirt or scuff marks that are part of what makes something look authentic. And we had to give the artists control over all of that."
RenderMan made its big-screen debut in The Abyss, followed by Terminator 2: Judgment Day and Jurassic Park, films that set off a revolution in motion-picture computer graphics. "After those movies, everyone wanted computer graphics, even though in those early days traditional effects were often cheaper and better," says Cook. Since then, computer-graphics images have gotten better and better, to the point that even Cook can't always tell when they are being used in a film.
Even though he has mastered countless computer-graphics challenges, Cook says there is still an elusive effect he and his peers have yet to perfect—a realistic human face. "If we look at a cartoon face or some other stylized rendition of a person's face, our brains don't have a problem accepting that. But the closer [the face] gets to looking human, the more a different part of the brain becomes engaged, and if something is not exactly right, it goes from looking realistic to looking creepy. Brad Pitt's incarnations in Benjamin Button came close. But we're still not there yet."