Framestore head of R&D Martin Preston talks about Arnold in Guardians of the Galaxy

Marvel Studios' box office hit Guardians of the Galaxy is now the highest-grossing movie of the year, and a sequel has already been announced. For Framestore, which delivered 633 final shots, Guardians presented several challenges, from a talking, photo-realistic raccoon called Rocket to a three-mile-wide city built inside the mined-out, severed head of a giant celestial being - Framestore's most complicated environment to date. We interviewed head of R&D Martin Preston to find out more:

Can you tell us a bit about yourself and the history between Framestore and Solid Angle?

I'm head of R&D at Framestore, so I lead the teams developing the proprietary technology we use on all the films we work on. We spend a lot of our time working on the technology we use to render our movies, and as part of that work we switched our primary renderer to Arnold for Gravity, back in late-2010/early-2011. Since then, every show Framestore has worked on has used Arnold.

During this time, Framestore's R&D team and shader department have worked closely with the Solid Angle developers to push our lighting technology forward. Arnold's developers have been amazingly patient with our dumb questions! Plus we eat all the free sushi that Marcos will provide.

What was the size of the team that worked on Guardians of the Galaxy?

The Framestore crew on that show peaked at 288 people.

Which modelling, animation and texturing packages were used?

We model and animate within Maya, and primarily paint textures within Mari. However, there is a fairly large set of proprietary tools layered on top of those to help improve the way our artists collaborate. For example, we have our own geometry/baking format (something called fBake) to pass data back and forth between packages.
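As a rough illustration only (the file schema and function names below are hypothetical, not the actual fBake format, which is proprietary), a minimal interchange step of this kind simply serialises per-object geometry and texture bindings to a neutral file that tools on either side of the pipeline can read back:

```python
import json

# Hypothetical stand-in for an fBake-style interchange file: the real format and
# schema are proprietary; this only illustrates the round-trip idea.
def bake_geometry(path, objects):
    """Serialise per-object geometry and texture bindings to one neutral file."""
    payload = {
        name: {
            "points": mesh["points"],              # flat list of xyz floats
            "faces": mesh["faces"],                # polygon vertex indices
            "textures": mesh.get("textures", {}),  # channel name -> texture file path
        }
        for name, mesh in objects.items()
    }
    with open(path, "w") as f:
        json.dump(payload, f)

def load_geometry(path):
    """Read the baked file back in another package or tool."""
    with open(path) as f:
        return json.load(f)

if __name__ == "__main__":
    bake_geometry("rocket_body.bake.json", {
        "rocket_body": {
            "points": [0.0, 0.0, 0.0,  1.0, 0.0, 0.0,  0.0, 1.0, 0.0],
            "faces": [[0, 1, 2]],
            "textures": {"diffuse": "rocket_body_diff.tx"},
        }
    })
    print(list(load_geometry("rocket_body.bake.json").keys()))
```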

Can you talk a bit about the Knowhere environment?

From a technical point of view, I'd say our shading of Knowhere was the biggest challenge in this movie. It’s a hugely complicated set, with an even more complicated lighting rig (in excess of 10,000 lights) in a smoky atmosphere.

In the past we'd struggled with rendering interior scenes that had a significant number of occluded lights, so we had begun to experiment with using our own bidirectional path-tracer on top of Arnold's ray intersection base (we'd always jokingly referred to these experiments as the 'van Damme' integrator).

For Knowhere we felt we needed to press these experiments into production, so we wound up rewriting our entire shader library, since all our shaders needed to work correctly when evaluated in a bidirectional sense. As part of our bidirectional work we also needed to modify our custom volume lighting tools so that fire and explosions could illuminate the set in a way we could sample well.
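For readers unfamiliar with the technique, here is a toy sketch of the vertex-connection step at the heart of any bidirectional path tracer. It is not the 'van Damme' integrator itself, and a production version also needs multiple importance sampling weights and proper BSDF and cosine terms; the point is that subpaths traced from the lights let heavily occluded sources contribute through explicit shadow-ray connections, rather than having to be discovered by chance from the camera side:

```python
import math

# Toy sketch of bidirectional vertex connection. The scene, BSDF and visibility
# queries are stand-ins; MIS weights and cosine factors are omitted for brevity.

def geometry_term(a, b):
    """Inverse-square falloff between two path vertices."""
    d2 = sum((ai - bi) ** 2 for ai, bi in zip(a["pos"], b["pos"]))
    return 1.0 / max(d2, 1e-6)

def connect_subpaths(camera_path, light_path, visible, bsdf):
    """Sum the contributions of every camera-vertex / light-vertex connection."""
    total = 0.0
    for cv in camera_path:                    # vertices traced from the camera
        for lv in light_path:                 # vertices traced from a light
            if not visible(cv["pos"], lv["pos"]):
                continue                      # shadow ray blocked, no contribution
            total += (cv["throughput"]
                      * bsdf(cv, lv)
                      * geometry_term(cv, lv)
                      * lv["throughput"])
    return total

if __name__ == "__main__":
    camera_path = [{"pos": (0.0, 0.0, 0.0), "throughput": 1.0}]
    light_path = [{"pos": (0.0, 2.0, 0.0), "throughput": 5.0}]  # e.g. a small occluded lamp
    always_visible = lambda a, b: True
    diffuse = lambda cv, lv: 1.0 / math.pi
    print(connect_subpaths(camera_path, light_path, always_visible, diffuse))
```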

All that said, despite the complexity of the tools we've built on top of Arnold, we're still heavily reliant on the Arnold core's ability to process large amounts of geometry. The full Knowhere environment, at 1.2 billion polygons, typically fit into less than 30 GB.
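Taking those quoted figures as round numbers, a quick back-of-the-envelope check shows how little memory that leaves per polygon on average:

```python
# Back-of-the-envelope check on the quoted Knowhere numbers (round figures only).
polygons = 1.2e9
memory_bytes = 30 * 1024**3       # "less than 30 GB", read here as 30 GiB
print(memory_bytes / polygons)    # ~27 bytes per polygon on average
```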

Rocket looked really nice. What fur shaders did you use?

They're proprietary shaders. We seem to spend an inordinate amount of time on every show working on our fur-shading setups, and Guardians was no different!

Did you use a lot of sub-surface scattering?

Yes, on Guardians we used it primarily on the two CG lead characters, Rocket and Groot, as well as the sundry alien prisoners in the Kyln sequence.

Did you use Arnold’s IPR for interactive shading and lighting?

We use a proprietary lighting tool, fArnoldGen, to allow our lighters to sit within Maya and yet interact with Arnold. fArnoldGen provides IPR facilities which some of our lighters use to help accelerate blocking.
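The sketch below shows the general shape of an IPR-style loop, with an invented renderer interface standing in for the real thing; it is not fArnoldGen's actual code or Arnold's API, just an illustration of interrupting and restarting a progressive render whenever an attribute edit arrives from the host application:

```python
import queue
import threading

# Hypothetical sketch of the interactive loop a tool like fArnoldGen sits on top
# of. The renderer interface here is invented for illustration only.

class ToyRenderer:
    def render_progressive(self, scene, stop_event):
        """Stand-in for a progressive render that refines until interrupted."""
        level = 0
        while not stop_event.is_set() and level < 5:
            level += 1
            print(f"refined to level {level} with key light at {scene['key_light']}")

def ipr_loop(renderer, scene, edits):
    """Restart the progressive render whenever an attribute edit arrives from the DCC."""
    while True:
        stop = threading.Event()
        worker = threading.Thread(target=renderer.render_progressive, args=(scene, stop))
        worker.start()
        attr, value = edits.get()   # block until the artist tweaks something in Maya
        stop.set()                  # interrupt the in-flight render...
        worker.join()
        if attr is None:
            break                   # sentinel: artist closed the session
        scene[attr] = value         # ...apply the edit and go around again

if __name__ == "__main__":
    edits = queue.Queue()
    edits.put(("key_light", 2.5))
    edits.put((None, None))
    ipr_loop(ToyRenderer(), {"key_light": 1.0}, edits)
```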

How have your tools improved since the beginning of this project?

During Guardians we didn't need to significantly alter fArnoldGen. However, we did substantially change our render-time procedurals (and in-Maya proxies) to handle large environments. The toolset, called fShambles, was rewritten to improve the threading used to accelerate procedural expansion. We worked closely with the Solid Angle guys, Thiago Ize in particular, to radically improve that. Some of the work Thiago did in response to our pestering has made it into the official Arnold 4.2 release.
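As a loose illustration of what threaded procedural expansion buys you (the names below are invented; this is not fShambles or Arnold's procedural API), the basic idea is to expand many deferred geometry archives in parallel rather than one after another:

```python
from concurrent.futures import ThreadPoolExecutor

# Rough illustration of threaded procedural expansion. The real work lives in
# fShambles and in Arnold's procedural machinery; the names below are invented.

def expand_procedural(archive):
    """Pretend to expand one deferred geometry archive into renderable nodes."""
    return [f"{archive}:node{i}" for i in range(3)]

def expand_environment(archives, max_workers=8):
    """Expand many archives in parallel instead of serially."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        expanded = list(pool.map(expand_procedural, archives))
    return [node for nodes in expanded for node in nodes]

if __name__ == "__main__":
    archives = [f"knowhere_chunk_{i:03d}.ass" for i in range(16)]
    print(len(expand_environment(archives)), "nodes expanded")
```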

Similarly, we extended the technology we use to handle FX rendering (a tool we call 'fMote'). Our lead developer on that tool, Per Karefelt, substantially extended it for Guardians to cope with the scale of the volumetrics in and around Knowhere. He also worked with our R&D chap Manuel Gamito, who wrote our custom raymarcher, to optimise how we could render such enormous datasets.
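For context, a raymarcher steps along a ray accumulating emitted light while attenuating it by the density already traversed. The emission-only toy sketch below shows that basic loop under Beer-Lambert absorption; the production raymarcher is of course far more sophisticated, with proper sampling, motion blur and light integration:

```python
import math

# Toy emission-only raymarcher through a volume, in the spirit of rendering fire
# and explosion volumetrics. Density and emission fields are simple stand-ins.

def raymarch(density, emission, origin, direction, length, step=0.1):
    """Accumulate emitted light along a ray, attenuated by the density in front of it."""
    radiance = 0.0
    transmittance = 1.0
    t = 0.5 * step
    while t < length and transmittance > 1e-4:
        p = tuple(o + t * d for o, d in zip(origin, direction))
        sigma = density(p)
        radiance += transmittance * emission(p) * sigma * step
        transmittance *= math.exp(-sigma * step)   # Beer-Lambert absorption over the step
        t += step
    return radiance

if __name__ == "__main__":
    fireball = lambda p: max(0.0, 1.0 - math.dist(p, (0, 0, 5)))  # density peaks at the centre
    glow = lambda p: 5.0 * fireball(p)                            # hotter core emits more
    print(raymarch(fireball, glow, (0, 0, 0), (0, 0, 1), length=10.0))
```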

What version of Arnold did you use?

The bulk of the show was rendered with 4.1.x, though we were testing early pre-release builds of the 4.2.x series.

How big is your render farm and what were your render times?

We peaked at about 15,000 cores for Guardians of the Galaxy, with the rest of the farm devoted to the other shows in production. Render times varied wildly, so it's not really possible to give meaningful figures.