An interview with Lucio Arese on Slit-Scan Crash Test
"Slit Scan Crash Test" is an experimental short film depicting a full-CG car crash test in super slow motion, using simulated slit scan photography effects that seem to warp time and space. The crash sequence is explored and reproduced several times at different camera speeds and angles. We interviewed its director and producer, Lucio Arese, on how he used Arnold for this project.

Hello, can you tell us a bit about yourself?
My name is Lucio Arese, I'm an Italian independent filmmaker. Since 2008 I have worked on a wide range of productions spanning commercials, music videos and experimental cinematography. My work involves a great deal of CGI and is focused mainly on the correlation between music and motion pictures. Some of my past projects have been showcased worldwide at festivals like Ars Electronica, Ciclope, SIGGRAPH, the Saatchi New Directors Showcase and many others.
What's the story behind this project?
This is a non-commercial personal project intended for artistic purposes, just for the sake of experimentation. It had a very long production process, with redesigns and changes along the way for a number of reasons. I financed the production out of my own pocket. The concept dates back to 2012, when I released Mimic, a music video that develops abstract 3D objects at a frantic pace. I thought it would be interesting to do something at the complete opposite extreme, using super slow motion. This is where the idea of a crash test came from, and I started working on it in my spare time. Serious production work started in 2013.
What modeling and animation software did you use?
The project was originally created in 3ds Max and developed entirely with Mental Ray for production rendering. Arnold came at a later time.
Why Arnold?
Once everything was complete in 3ds Max I started wondering how to get the whole work rendered: I had many thousands of frames to get done and I needed a render farm. In 2014 Johannes von Liebenstein of Lovestone Film (who was representing me in Germany at the time) put me in contact with Martin Herzberg and Barry Kane of Munich-based Trixter. We went into talks about the possibility of getting the rendering job done at their production facility. They suggested I pull the whole work out of 3ds Max, translate it into Maya and reshade it with Arnold in order to use their farm. I didn't know Arnold at the time and I got curious, seeing it also as an opportunity to learn new tools for production. Despite having to start over almost from the beginning, I took the challenge.
Was it difficult to transition to Arnold?
It was frustrating at the very beginning because everything was new to me, including Maya. Plus, I had a number of issues exporting the geometry from 3ds Max to Maya, resampling the animation framerate and synchronizing all the geometry caches. But after the first moments of grief, I settled into the Arnold environment and got to work with it very easily and quickly. I found it extremely powerful and flexible, with very few settings to play with. Once I got familiar with it, the reshading process went smoothly and I managed to get even better photography than before.

What were the main rendering challenges?
Basically, render time: I had to find the best compromise to get good image quality without letting render times get out of hand, and Arnold was of great help on this point. Even with the lowest AA and sampling settings, I managed to get satisfying quality, and I could fine-tune them easily depending on the shot I was working on. The end result looked good enough to me and allowed me to stay within my rendering budget.
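As a rough illustration of the per-shot tuning he describes, here is a minimal Python sketch of adjusting MtoA's global sampling controls in Maya; the node and attribute names are MtoA's render settings, while the values are purely illustrative and not the production numbers.

```python
# Minimal sketch, assuming a Maya + MtoA setup like the one described:
# trading image quality against render time per shot via the global samplers.
import maya.cmds as cmds

# Camera (AA) samples drive overall quality; start low and raise only if needed.
cmds.setAttr("defaultArnoldRenderOptions.AASamples", 3)

# Per-ray-type samples, kept modest to stay within the render budget.
cmds.setAttr("defaultArnoldRenderOptions.GIDiffuseSamples", 2)
cmds.setAttr("defaultArnoldRenderOptions.GIGlossySamples", 2)
```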
Did you use any references for lighting?
In the beginning I tried to use IES lights to have a very precise lighting model of the crash testing facility, but I eventually discarded them and used simple Maya spotlights instead, trying to have a more flexible lighting environment that I could adjust by eye and in compositing where necessary. Light management in Arnold is easy and has a very useful and intuitive control, exposure, which works like an f-stop value and lets you move a light up and down to the desired level with no effort, getting close to what you want in little time. Lighting has been one of the most effective aspects of the whole work.
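For readers unfamiliar with the control he mentions: Arnold lights carry an exposure attribute whose effect is intensity multiplied by 2 to the power of the exposure, so each added stop doubles the light. A minimal Maya Python sketch, with a hypothetical spotlight standing in for one of the facility lights, might look like this.

```python
# Minimal sketch (not the production scene): Arnold's exposure control on a
# Maya light, exposed by MtoA as "aiExposure". Effective brightness is
# intensity * 2**exposure, which is why it behaves like an f-stop.
import maya.cmds as cmds

# Hypothetical spotlight standing in for one of the facility lights.
light_shape = cmds.spotLight(name="facilityKeyLight", intensity=1000.0)

# Brighten the light by two stops (x4) without touching the raw intensity.
cmds.setAttr(light_shape + ".aiExposure", 2.0)

# Equivalent brightness expressed as raw intensity, for comparison:
effective = cmds.getAttr(light_shape + ".intensity") * 2 ** cmds.getAttr(light_shape + ".aiExposure")
print(effective)  # 4000.0
```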

Where did the original car model come from? Did you have to simplify it for rendering?
Not at all: it's a relatively simple car model I downloaded for free years ago (credits go to the modeler moosTAF). It contained the SLK base chassis, wheels and some interior. I modeled all the rest and created all the dynamic simulations to animate the impact: shattering debris, broken glass, object deformations, flying cables, motion of the dummies, exploding airbags. At this point the model became quite big and complex, but Arnold crunched it without the slightest problem. I initially used stand-ins for a bunch of objects with many subdivisions and subtle displacements (like the facility concrete), but in the end I found they weren't even necessary. Arnold performed even better with the whole model as is.
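For context on the stand-ins he mentions: in MtoA, heavy geometry can be exported to an .ass archive and referenced back through an aiStandIn node that Arnold only loads at render time. A minimal sketch, with a hypothetical file path and assuming a pre-exported archive, might look like this.

```python
# Minimal sketch, not the production setup: referencing heavy geometry
# (e.g. the facility concrete with its displacement) as an Arnold stand-in
# instead of keeping it live in the Maya scene. The .ass path is hypothetical
# and assumed to have been exported beforehand.
import maya.cmds as cmds

standin_shape = cmds.createNode("aiStandIn", name="concreteStandInShape")

# Point the stand-in at the exported Arnold archive; Arnold loads it only at
# render time, keeping the working scene light.
cmds.setAttr(standin_shape + ".dso", "caches/facility_concrete.ass", type="string")
```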
Where did the "slit scan" idea come from?
I got interested in this technique years ago and I've always been fascinated by the effects you can achieve with it, and by how it can alter space and time in so many ways, both in photography and motion pictures. I thought for a long time about doing something to experiment with this technique, and when I got the idea for this work, it felt like the right occasion. Plus, I had the possibility of blending it with slow motion, creating a strange, somewhat surreal dimension.
The slit scan effect was achieved in post-production, working with black and white maps used to time-displace and distort the original rendered shots. The making-of video shows a couple of examples of how the effect was simulated. One big source of grief for me was seeing a rolling shutter effect introduced in the latest versions of Arnold, a little while after I completed production! It would have been just perfect for including this effect directly in the render.
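To give a sense of the time-displacement idea behind the simulated slit scan (not his actual compositing setup), here is a minimal NumPy sketch: a grayscale map decides, per pixel, how far back in time to sample from a stack of already-rendered frames.

```python
# Minimal sketch of the time-displacement idea behind a simulated slit scan:
# brighter areas of a grayscale map sample further back in the frame stack.
import numpy as np

def time_displace(frames, gray_map, max_offset):
    """frames: (num_frames, H, W, 3) float array of rendered frames.
    gray_map: (H, W) float array in [0, 1]; brighter = sampled further back.
    max_offset: largest time offset, in frames."""
    num_frames, h, w, _ = frames.shape
    out = np.empty_like(frames)
    offset = np.rint(gray_map * max_offset).astype(int)   # per-pixel delay
    ys, xs = np.mgrid[0:h, 0:w]
    for t in range(num_frames):
        src = np.clip(t - offset, 0, num_frames - 1)      # delayed frame index per pixel
        out[t] = frames[src, ys, xs]
    return out

# Tiny synthetic example: 10 frames of 4x4 noise warped by a horizontal ramp.
frames = np.random.rand(10, 4, 4, 3)
ramp = np.tile(np.linspace(0.0, 1.0, 4), (4, 1))          # left = no delay, right = max delay
warped = time_displace(frames, ramp, max_offset=5)
print(warped.shape)  # (10, 4, 4, 3)
```

A real comp would interpolate between neighbouring frames rather than rounding to whole frame indices, but the rounded version keeps the core idea visible.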
What version of Arnold and shaders did you use?
The Arnold core version I used was 4.2.2.0, and the MtoA plugin version used for production was 1.2.3.1. aiStandard materials were used to shade the whole model.
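As a small illustration of the shading setup he describes, here is a minimal Maya Python sketch that creates an aiStandard shader (the Arnold 4 era standard material) and assigns it to a placeholder mesh; the attribute values are illustrative and not taken from the production scene.

```python
# Minimal sketch, assuming a Maya + MtoA 1.x setup like the one described:
# an aiStandard shader assigned to a stand-in piece of geometry.
import maya.cmds as cmds

shader = cmds.shadingNode("aiStandard", asShader=True, name="carPaint_AI")
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name="carPaint_AISG")
cmds.connectAttr(shader + ".outColor", sg + ".surfaceShader")

# Simple glossy paint: a diffuse colour plus a moderate specular lobe.
cmds.setAttr(shader + ".color", 0.8, 0.05, 0.05, type="double3")
cmds.setAttr(shader + ".Ks", 0.6)
cmds.setAttr(shader + ".specularRoughness", 0.25)

# Assign to placeholder geometry standing in for a body panel.
body_panel = cmds.polyCube(name="carBodyPanel")[0]
cmds.sets(body_panel, edit=True, forceElement=sg)
```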

Why did you choose to render on Zync?
Zync was pointed out to me, again by Martin Herzberg: they were involved in a lot of big projects and weren't able to find a slot for my work, so he suggested I try this new service, acquired by Google, that used the Google Compute Engine for cloud rendering. It fits very well for small production studios and even individuals like me. By then it was the end of 2015 and the project had already been completely redesigned (I had even changed the whole edit and the soundtrack, using the wonderful music composed by Craig Vear). I calculated an estimate for the whole job and decided to go with it.
How was the Zync experience?
I must say that it wasn't easy setting up the whole thing: I had to get a bit into Python scripting and encountered strange glitches in uploading the assets, but once everything was up, it was just great. The very nice thing about Zync is that once you've synced all your assets online, they stay in the cloud and you can launch render jobs anytime you want with very little or no data transfer at all. It becomes a small-sized rendering facility at your disposal (base accounts have 50 virtual machines of different types available, which is quite good for a lot of purposes). It's a light and well-crafted solution that works directly from your host application, and if you're into Python you can get as much flexibility as you want.
What were your render times?
On my main dual Xeon 28-core / 56-thread workstation the heaviest passes required 8-9 minutes, the quickest 1-2 minutes. Depending on the specific shot, I would say an average render time across all the passes could be 5 minutes. Zync's 32-core machines performed a bit slower than that. Anyway, in a few hours I was able to complete an entire shot that would have required days or weeks on my own: I could download all the frames, do the main compositing work, launch other render jobs and so on. It's been an efficient pipeline.

