Image courtesy of Framestore

With Gravity set to be this year's sci-fi blockbuster, we talked to Martin Preston of Framestore about the rendering challenges the studio encountered on the film.

Can you tell us a little about yourself?

I'm head of R&D at Framestore, so I lead the team that’s responsible for the proprietary tools we use here.

What prompted the use of Arnold on Gravity?

We began worrying about rendering Gravity in the summer of 2010, and most of the worrying was about the length of the shots in the movie and the level of realism we needed to deliver.

Up to then our rendering approach had tended to rely on point-based methods, and we were concerned that on a show like Gravity the lighters would spend more of their time managing data than actually lighting. As a result we started to investigate whether we could switch to a fully raytraced approach.

We experimented with a number of renderers, as well as modifying our existing approach, before settling on using Arnold in late 2010.

Why did you choose Arnold?

A mixture of things, really. Firstly, it's very fast. Secondly, you can really tell it's been through many VFX productions before, so it provides all the flexibility you need. Finally, the Solid Angle team were able to work closely with us. Changing renderers in a sizeable VFX house like Framestore is mildly terrifying, so being able to work closely with the developers helped allay our fears!

How did you go about transitioning to using Arnold?

We wanted to make the switch as transparent as possible, both to the lighters and the pipeline, so once we'd committed to using Arnold we modified all our R&D tools to make Arnold look much like our existing renderer to the artists.

Our main lighting tool, fRibGen, allows lighters to generate render passes that manage the application of shaders & data to geometry. It was straightforward to add a plugin to that system (fArnoldGen) to present Arnold passes, so that lighters could continue to work in pretty much the same way they always had.

Because we make extensive use of rendertime procedurals we also needed to port our geometry, particles, volumetric, hair & crowd procedurals over to Arnold.

The end result of all this work was that lighters were able to carry on working with familiar tools, albeit with a substantially reduced number of controls they needed to play with, and there was very little pipeline development required.

What was the most difficult part of the switch?

We had expected the process of moving all the tools over to be quite time-consuming, but in practice it was quite straightforward. Arnold's history of use in film VFX means it tends to have many of the obscure features we were reliant upon.

However, one interesting thing that came up was how we needed to re-evaluate our approach to optimisation. In particular, we'd been used to a rendering methodology which rewarded streaming geometry into the renderer only when required. Once you're shooting rays everywhere you usually find there isn't a great deal of benefit in delaying geometry generation, as almost everything needs expanding anyway.

As a result we put a lot of development effort into speeding up how we expand our procedurals, in particular exploiting threading to accelerate the process of passing data to Arnold. On some of the extremely complicated scenes in Gravity this made a substantial difference to our render times.
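
To illustrate the general idea (a minimal sketch only, with hypothetical types; this is not Framestore's or Arnold's actual code), eager, threaded expansion of independent procedurals might look like this:

```cpp
#include <future>
#include <vector>

// Toy stand-ins for real geometry types and procedural expansion.
struct Mesh { std::vector<float> points; };

struct Procedural {
    int id;
    Mesh expand() const {                   // stand-in for heavy geometry generation
        Mesh m;
        m.points.assign(100000, float(id)); // placeholder work
        return m;
    }
};

// Expand every procedural on its own task. Under a ray tracer almost
// all geometry ends up being hit by some ray, so expanding eagerly and
// in parallel beats streaming it in on demand.
std::vector<Mesh> expandAll(const std::vector<Procedural>& procs) {
    std::vector<std::future<Mesh>> jobs;
    jobs.reserve(procs.size());
    for (const Procedural& p : procs)
        jobs.push_back(std::async(std::launch::async,
                                  [&p] { return p.expand(); }));

    std::vector<Mesh> meshes;
    meshes.reserve(jobs.size());
    for (auto& job : jobs)
        meshes.push_back(job.get());        // block until that expansion finishes
    return meshes;
}
```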

Did you have to write any custom shaders?

Yes! We're used to needing to write custom shaders for our film work, and so when we adopted Arnold we needed to accommodate how we'd previously thought of developing looks for assets. Our head of shaders, Dan Evans, and his team (Nathan Walster, Jami Levesque, Adrian Bell and Lucy Salter) were also keen to present a familiar interface to the lighters. Consequently we ported our existing shader library over to Arnold, which meant that the lighters didn't need to learn a new way of developing the look.

We worked closely with the Solid Angle team while this was going on, and they were able to add new features to the renderer to help the transition. The addition of cylindrical lights was a particular favourite of our lighters, as was support for the tangent calculation we used extensively on the many (many) brushed metal surfaces.

Were there any particular rendering challenges?

There were a few specific areas of shader development during the show. In particular we put substantial effort into developing a novel toolset to render the many cloth surfaces in the film, as well as a related importance sampled approach to hair rendering. Ian Comley, one of the show's lead TDs, touched our cloth shaders the most and was really able to push the realism of the shader setup.
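
As a generic illustration of the importance-sampling principle (a common building block, a cosine-weighted hemisphere sample, rather than the show's actual cloth or hair model), the idea is to draw directions in proportion to the integrand so fewer rays are wasted where it is near zero:

```cpp
#include <cmath>

struct Dir { float x, y, z; };

const float kPi = 3.14159265358979f;

// Map two uniform random numbers in [0,1) to a direction on the
// hemisphere with pdf = cos(theta) / pi, so samples cluster where the
// cosine-weighted integrand is largest.
Dir cosineSampleHemisphere(float u1, float u2) {
    float r   = std::sqrt(u1);
    float phi = 2.0f * kPi * u2;
    return { r * std::cos(phi), r * std::sin(phi), std::sqrt(1.0f - u1) };
}
```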

The rendering of the earth asset was another challenge and one of our lead shader writers, Nathan Walster, developed a volumetric approach (which raymarched the atmosphere, clouds and terrain data set) building on a variety of Framestore tools, such as our volume file format (fVox) and FX rendering tool (fMote).
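
A minimal sketch of the raymarching idea (generic Beer-Lambert accumulation with stand-in density and lighting functions, not the fVox/fMote implementation):

```cpp
#include <cmath>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(float s)       const { return {x * s, y * s, z * s}; }
};

// Stand-in density field: atmosphere thinning exponentially with altitude.
float density(const Vec3& p) { return std::exp(-p.y * 0.1f); }

// Stand-in for the light scattered toward the camera at p.
Vec3 inscatter(const Vec3& p) { return {0.4f, 0.6f, 1.0f}; }

// March along the ray, accumulating in-scattered light attenuated by
// the transmittance of the medium already traversed.
Vec3 raymarch(Vec3 origin, Vec3 dir, float tMax, int steps) {
    float dt = tMax / steps;
    float transmittance = 1.0f;
    Vec3 radiance = {0, 0, 0};
    for (int i = 0; i < steps; ++i) {
        Vec3 p = origin + dir * (dt * (i + 0.5f));
        float sigma = density(p);
        float stepT = std::exp(-sigma * dt);
        radiance = radiance + inscatter(p) * (transmittance * (1.0f - stepT));
        transmittance *= stepT;
        if (transmittance < 1e-3f) break;  // medium is effectively opaque; stop early
    }
    return radiance;
}
```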

Was it easy to render the show?

No! Rendering big shows is always a challenge, and Gravity was no exception. We put a lot of development effort into optimising renders, both on the R&D and shaders side. In particular we took care to manage the complexity of our shading networks (i.e. the assembly of coshader units which were plugged together to manage the look of geometry) to ensure they weren't needlessly complicated. In some cases we wrote dedicated 'monolithic' shaders for those cases where graphs were used so often there was a benefit to simplifying the workload.

Other optimisation work included implementing a technique known as 'Russian roulette' to reduce the number of rays we fired, using portal lights to focus rays better, and techniques to improve the calculation of ray differentials to reduce the load on our servers. Lastly, one of our TDs, Alex Rothwell, developed a way of caching illumination between eyes to accelerate the stereo rendering we used for final delivery.
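
Russian roulette is standard Monte Carlo practice; a minimal sketch of the general technique (not Framestore's implementation) looks something like this:

```cpp
#include <algorithm>
#include <random>

std::mt19937 rng{42};
std::uniform_real_distribution<float> uniform(0.0f, 1.0f);

// 'throughput' is how much this path can still contribute to the pixel.
// Low-contribution paths are killed at random; survivors are reweighted
// so the Monte Carlo estimate stays unbiased. Returns false to terminate.
bool russianRoulette(float& throughput, int depth, int minDepth = 3) {
    if (depth < minDepth) return true;            // always trace the first bounces
    float survive = std::min(throughput, 0.95f);  // survival probability
    if (uniform(rng) >= survive) return false;    // kill this ray
    throughput /= survive;                        // compensate the survivors
    return true;
}
```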

What do you think Arnold brought to the show?

I'd say simplicity and predictability. The simplicity comes from the lighters not needing to understand as much of the mechanics of how the renderer operates, so they have fewer things to worry about.

The predictability comes from our production staff now being able to confidently predict and prioritise when renders will finish (even if the predictions are alarming). Previously, managing rendering on big shows always seemed to involve a lot of per-shot or per-frame 'surprises', usually where frames would require unusual amounts of memory, or would present unusual optimisation challenges.

On a big show that's a real bonus!

What resources did you need?

It was a large show for us. We worked on it for just over three years, and at least 400 people worked on it during that period. In the render crunch we had about 15,000 cores working on the show, and about 600 terabytes of disk space serving all of that!
