Jungle Capture

Above: The ARRI Alexa Studio camera follows Mowgli on-set. Simulcam gives the live production crew an idea of where the CG will be, and the lighting design has already been set using highly photorealistic software such as Maya and RenderMan.

VFX Supervisor ADAM VALDEZ explains how, in the latest adaptation of The Jungle Book, the capture of the single live character worked hand in hand with the groundbreaking VFX.

The success of this year’s adaptation of The Jungle Book hinges on the belief that Mowgli is actually interacting with the animals. This couldn’t be a Mary Poppins-type ‘lever in the VFX’ job. The team wanted the feel to be as if they had just walked into the jungle and filmed it. One of the VFX Supervisors, Adam Valdez, explains how current technology helps integrate live action into CG.

“Typically for visual effects your job is to take a photograph and ‘integrate into’ that photograph. In this case you’re inverting that relationship: the photograph is the smallest part of the frame and it needs to fit into CG that isn’t done yet, which makes the whole project a little bit more like what we would call an ‘element’ shoot, where you’re shooting a component of a shot. Normally an element shoot is in post-production: you have your shot in place, and if you’re shooting a component you know the lighting direction, you know what the camera’s doing, and you have to shoot a piece that will fit into an existing shot.

“Imagine you’re shooting a home movie, inverted in a strange way, so really it comes down to a lot of preparation and planning before you shoot. One comparison is the movie Gravity, which is actually a more extreme example: when they were shooting actors they used motion control rigs on the set to shoot everything in a super-controlled manner, so effectively they had pre-visualised things and then they would shoot it that way. This wasn’t quite that tightly constrained, but therein lies the whole challenge.

“So then there was a process of storyboarding the movie and doing a pre-vis of the movie, and that’s where a lot of motion capture techniques were used to give live action filmmakers a hand at doing pre-visualisation animation. Then the big key was working with a Director of Photography before the shoot to pre-light every scene. We did pre-lighting, or lighting design, in the computer so that we could essentially line up every shot in the computer and figure out two things: how much art department work there was going to be on-set, and what the lighting was going to be on-set. For instance, if you have a kid walking through dappled forest light and then he crosses the threshold out into open sun, you need to get all those lighting cues right in the physical space of the sound stage, but you’re standing in a bunch of blue so you don’t actually have any physical landmarks to relate to. That’s why you have to prepare, so that when you get on-set you know you’ve mapped out everything in the virtual world and you can give the DP some tools to visualise the lighting on-set.

“Then after that we’re into more post concerns, but that gives the high-level view of the problem and the basic solution, which is prep and taking the prep on-set.”

The team wanted the film to look like it had been shot on film with real cameras and real camera support.

Blue Screen Set

“On-set you’d have the boy on a small piece of physical set work, with a little bit of jungle floor that looks camera-ready, that might be something like 20 feet long by ten feet wide. We only built what we needed for each section of each scene, and the rest of the stage was blue. What we did in the computer (using software like Maya and RenderMan) for the lighting design, to help our DoP Bill Pope, was to create a few key images that you can take with you on-set as a guide. It’s a step ahead of just looking at, say, reference photos or production design, because you’re in the 3D space inside the computer software; it’s kind of like lighting inside a video game. You’re standing in the set, you can look around, you can move your sun location and see how that looks on the characters, you can place trees and other things, and you can place other lights in there to get the lighting effect that you want. Basically you’re working in a virtual space.”

Traditional Shooting

“Then you take those images on-set and it becomes very traditional: you have the kid standing on a patch of dirt, he’s got blue all around him, and you’ve got 20Ks hanging, gobos and bounce cards and everything. You’re still shooting a boy for real, so the set just looks like a traditional shoot; it’s just that you’re walking on-set with your plan already made, and you have a way of knowing where in the space of the sound stage that will someday sit within the sets that the computer CG artists are building.

“Again, the most concrete example is that if he’s walking out of a forest into open sun, and he’s walking along for 20 feet, you know that ten feet along he’s going to walk into the sun. You need a way of knowing that later on that point will correspond to where the tree line is when the VFX are done.

“So you work out the staging, as in you start here and you end here, and we would bring the images of the lighting design on-set. Like Google Street View, where you can look 360˚, these were renders where you can stand somewhere and look in any direction. We did a couple of those 360˚ photographs; it’s like taking the virtual world and taking a 360˚ snapshot, then looking at it on an iPad in, say, panorama view. We did that so they could stand on the set and look at every angle, overhead and from all different directions. You then know where, say, the trees are going to be and where the open sun is going to be. You have to bring it on-set with you in a way that makes it practical to do your lining up.
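Look-anywhere 360˚ renders of this kind are commonly stored as equirectangular panoramas, where the view's yaw maps to the horizontal axis and pitch to the vertical. As a rough sketch of the idea only (the function name and image layout are illustrative assumptions, not the production's tooling), a view direction resolves to a pixel like this:

```python
def pano_lookup(yaw_deg, pitch_deg, width, height):
    """Map a view direction (yaw/pitch in degrees) to a pixel in an
    equirectangular 360-degree panorama: yaw wraps across the full
    width, pitch runs from +90 (top row) down to -90 (bottom row)."""
    u = (yaw_deg % 360.0) / 360.0       # horizontal fraction, wraps at 360
    v = (90.0 - pitch_deg) / 180.0      # vertical fraction, 0 = straight up
    return int(u * (width - 1)), int(v * (height - 1))

# Looking straight ahead lands on the left edge at mid-height
x, y = pano_lookup(0.0, 0.0, 4096, 2048)
```

Turning a full 360˚ brings you back to the same column, which is what makes the panorama feel seamless when you spin the iPad around.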

“There’s nothing actually brand new about this kind of on-set prep; it’s just the length to which we went, and the level of planning and collaboration between computer artists and live action stage crew like DPs and gaffers. Normally we don’t really collaborate with the DP that much: they have the movie that they’re shooting, and we match to whatever he or she establishes. Here we really had to work hand in hand. We needed every single shot to have a technical set-up, and that’s why we had to do it that way.

“The new technology that has really made the film happen is more on the computer graphics post side, with the ability to do all the fur and so on. It is related to lighting, though, because the software we use to do lighting has become a lot more sophisticated over the last five years and behaves a lot more like natural light. It means that people who do camera and lighting in the computer are now working within a very direct simulation of camera and light effects, and therefore the techniques of lighting in a computer and lighting on-set are getting closer to one another, just because the physical simulation of light in the computer is now very physics-based, very natural. That creates a more direct correlation, and I think that’s why computer lighting can now aid in lighting design. You can work in the computer in a way that a lighting cinematographer would feel comfortable with; they can design in that toolset.

“The other thing we have available on-set is something called Simulcam. This isn’t brand new tech, but it is very clever; you’ll see it on most films that have a lot of blue or green screen. You have your camera on a crane or a dolly, which has measuring devices on the different axes. The movements of that camera support are then fed into the computer, and the computer is able to replicate the movement of the camera inside the software. These camera movements are telling the software camera where to look, so you get a corresponding view in the 3D world of the future set.
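Those axis "measuring devices" are typically rotary encoders, and driving the software camera amounts to converting their tick counts into angles and positions every frame. A minimal sketch of that conversion, where the tick resolution, the 1.5 m lens height and all names are assumptions for illustration rather than the production's actual rig:

```python
def encoder_to_degrees(ticks, ticks_per_rev=4096):
    """Convert one axis encoder's tick count to degrees of rotation
    (4096 ticks per revolution is an assumed resolution)."""
    return ticks / ticks_per_rev * 360.0

def virtual_camera_pose(pan_ticks, tilt_ticks, dolly_metres):
    """Combine crane/dolly axis readings into a pose the software camera
    can copy, so its view of the CG set matches the physical camera."""
    return {
        "pan_deg": encoder_to_degrees(pan_ticks),
        "tilt_deg": encoder_to_degrees(tilt_ticks),
        "position_m": (0.0, 1.5, dolly_metres),  # assumed 1.5 m lens height
    }

pose = virtual_camera_pose(pan_ticks=1024, tilt_ticks=-512, dolly_metres=2.0)
# 1024 ticks on a 4096-tick encoder is a quarter turn: pose["pan_deg"] is 90.0
```

Feeding this pose to the renderer each frame is what keeps the "view in the 3D world of the future set" locked to whatever the operator does with the real camera.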

Director Jon Favreau and the film’s only on-screen talent, Neel Sethi, start the process of creating the river scene with Baloo the bear.

“Now you can do an A-over-B composite with video just on-set, and you can see your foreground subject in real time, with a blue screen key, over a rendering from the computer software. You see a background being filled in in real time. It’s a little jittery and a little bit fiddly sometimes, because if it’s not a pure blue screen behind the subject then the assist has to ‘garbage matte’ out the stage and so forth to give you a clear view. But it gives the DP a good sense of their composition and their perspective lines: they can see a horizon, they can see how much sky, they can see what objects are behind the subject if they want to re-think their angle, all sorts of things. The big thing that underlies all of this is that the director, the DP and Rob Legato, who was our Visual Effects Supervisor on Jungle Book in LA, all had the intention of wanting this movie to feel in the end like we just went and shot this thing somewhere. We wanted the cameras to feel like they are responding to the action, we wanted the cameras to feel like they are grounded in physical production, and we wanted exposures and final photographs to feel like something filmic, not a sort of new-era, heavily graded digital feeling. We wanted it to feel like a movie shot on film with real cameras and real camera support. (It was shot on an ARRI Alexa Studio camera.)
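At its core, that A-over-B composite is a per-pixel matte pulled from the blue screen, with the CG render showing through wherever the matte says "screen". The sketch below is a deliberately crude, hard-edged version of the idea (real keyers pull soft mattes on full frames; the threshold and names here are illustrative only):

```python
def key_alpha(r, g, b, threshold=0.3):
    """Hard chroma-key matte: treat the pixel as blue screen when blue
    exceeds both other channels by more than the threshold."""
    return 0.0 if (b - max(r, g)) > threshold else 1.0

def comp_over(foreground, background):
    """A-over-B: keep foreground pixels where the matte is solid, show
    the CG background render where the blue screen was keyed out."""
    out = []
    for (fr, fgr, fb), bg in zip(foreground, background):
        a = key_alpha(fr, fgr, fb)
        out.append(tuple(f * a + b * (1.0 - a)
                         for f, b in zip((fr, fgr, fb), bg)))
    return out

fg = [(0.8, 0.6, 0.5), (0.1, 0.1, 0.9)]   # actor pixel, then pure screen blue
bg = [(0.2, 0.5, 0.2), (0.2, 0.5, 0.2)]   # rendered jungle green
# comp_over(fg, bg): the first pixel keeps the actor, the second shows the render
```

The garbage matte the assist draws is just a second mask multiplied into the same alpha, forcing everything outside the blue screen to show the render regardless of colour.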

“Like any film, a healthy exposure, avoiding any bottom or top clipping, gives you the most latitude in post, which of course is preferable. What we did on this film, again with the same group of people, was build viewing LUTs that were very neutral, did not have any colour biases, and gave us all a Kodak print stock general shape, tonal response and number of stops. We’ve had experiences on other films where the LUTs were a bit whacky, or too specialised, or too colour-biased, and we’ve kind of suffered working under those. We were all determined to do something fairly neutral. There are some standards emerging with the ACES project, which is attempting to establish some centre points or norms that everybody can agree on as a neutral starting place.

“We wanted to do our own, as we were feeling that that wasn’t all locked in yet at the beginning of the production; in fact it’s not a long way away from it.

“Anyway, we tested the LUT on set and Bill liked it, and that was what he was viewing through on-set, placing his exposure based on those stops that we chose and just doing straight-ahead photography from there. We didn’t ask him to alter his style of shooting in any particular way.”
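A viewing LUT of the kind described bakes a tone curve into a lookup table that the monitoring path can apply in real time. The sketch below uses a generic highlight-rolloff curve purely as a stand-in; the actual shape of a Kodak print emulation, and of the production's own LUT, is not public here, and every name is illustrative:

```python
def film_like_tone(x):
    """Stand-in tonal response: near-linear midtones with the highlights
    rolled off, loosely in the spirit of a print-stock curve."""
    return x * (1.0 + x / 4.0) / (1.0 + x)

SIZE = 17  # real viewing LUTs are usually 3D cubes (e.g. 33x33x33 entries)
LUT = [film_like_tone(i / (SIZE - 1)) for i in range(SIZE)]

def apply_lut(value):
    """Look up a 0..1 scene value with linear interpolation between entries,
    which is how a monitor path evaluates a baked LUT cheaply per pixel."""
    t = min(max(value, 0.0), 1.0) * (SIZE - 1)
    i = min(int(t), SIZE - 2)
    frac = t - i
    return LUT[i] * (1.0 - frac) + LUT[i + 1] * frac
```

Because the curve is monotonic and colour-neutral (the same table on all three channels), the DP can place exposure by eye through it without the LUT introducing a colour bias of its own.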

Posted on June 15, 2016 and filed under vfx, moco, mocap, virtual production.