By Todd Gallina, Trace3 VP of Brand Strategy and PR
Oliver Meiseberg, VP of RenderMan at Pixar, welcomed the jam-packed crowd to the 50th SIGGRAPH and the 20th RenderMan celebration at the show. Each year Pixar gives away a collectible teapot, and this year was no different. Why is a teapot the mascot for Pixar's RenderMan? That deserves a blog post of its own. After Oliver left the stage we were introduced to Dylan Sisson, RenderMan Marketing Manager, who took us through some history of where RenderMan came from and how it has evolved... and required more firepower. Each new feature has demanded more compute, starting in 1995 with the original 294-core Sun Microsystems render farm used to make Toy Story and growing with every film since.
The rendering software itself has also matured over the years. Version 25 was used for Elemental, and for the first time in RenderMan's history, individual artists were able to render full scenes at their own workstations. It was then announced that RenderMan 26 will be released in Q4 of this year. I learned that denoising images quickly is key to speeding up the animation process. Thanks to Renee Tam, Pixar Global Technology Supervisor on Elio, I learned that whenever a feature goes into production, it is assigned a technology supervisor. Renee was selected for Elio, and her job is to monitor all of the tech going in and out of the movie. She took the stage to talk specifically about denoising on her film Elio, the next film from Pixar. She demonstrated the upgraded denoising speed and shared some small snippets from the film. Looks awesome.
Pixar is not the only studio using Renderman.
Steve May (CTO, Pixar) took the stage and reminded us that Pixar's Elemental was a landmark moment in computer animation history. It was the most technically demanding feature of all time, and he thanked Trace3, AMD, Supermicro, and NVIDIA for all we provided to deliver a technology solution that could handle Elemental. He went on to say that industry collaboration is essential and announced the Alliance for OpenUSD. OpenUSD will allow different studios to use a common file format for environments, props, and so on, saving technology teams hours of conversion time.
Finally, maybe the biggest treat of the night was learning that Gareth Edwards, director of Godzilla (2014) and Rogue One: A Star Wars Story, would be taking the stage with Jay Cooper (Visual Effects Supervisor, ILM) to discuss their latest movie, The Creator. You guessed it: the movie was rendered using RenderMan. After asking everyone to put away their phones and begging them not to record what they were about to see, the audience was given a sneak peek at The Creator, which will be released on September 29, 2023. The roughly six-minute clip was pretty damn awesome.

After the clip, Gareth shared that he used to be a visual effects artist and, because he wasn't very good at it, had to become a director, which drew laughter from the crowd. He did say that his first movie, Monsters, was small and that he did much of the visual effects himself. Now he works on big-budget films but stays very interested and involved in technical decisions because at his core he is still a visual effects artist. For his current movie, he and his team did something that has never been done before: they shot in eight different locations without using any green screens or motion capture, then did the visual effects over the footage they shot. The movie features three different races of humans, robots, and sentients, and which background characters would ultimately belong to one race or another wasn't decided until they saw the footage. Since the movie is about AI taking over half the planet, he made sure to insert a few comments about how AI could threaten the creative process. He really had no idea when he started this movie how quickly generative AI would become part of our everyday world.