Anyone who went to the first BSC camera test last summer will understand why MPC set up its Data Lab. A couple of the digital cameras included in that test had their file formats transcoded incorrectly, and the results were very poor. Those results caused much consternation among proponents of digital cinematography and ultimately led to a second test, which reached much more ‘real-world’ conclusions.
MPC did offer their help for the first tests, but unfortunately the post routes had already been decided and what they were trying to explain to the BSC was lost in translation – a bit like the first test’s camera data.
MPC’s Data Lab was formed to deal with the increasing number of data camera formats and their translation into post production – services which, conveniently, can be backed up by the rest of the building in Wardour Street. They will work with any camera, for any reason, going to any post route.
But perhaps even MPC didn’t see the rise of the video DSLR on the horizon. Their advice on using the Canon EOS 5D and 7D is suitably pedagogic, as you would expect from a lab, but their solution is unique and stands as a good illustration of the Data Lab philosophy.
Officially, MPC’s Lab questions the camera compression found in the 5D MkII, 7D and 1D MkIV: “It cannot be pushed too far in the grade before it starts to break up.” They warn people not to use it for green screen and not to expect much image manipulation because of its 8-bit performance. They also don’t pull any punches about the aliasing, rolling shutter and frame rates. But if people are going to shoot with these cameras, MPC have a solution.
Chris Vincze: “The two main problems with the Canon were that it had no metadata at all – no timecode, no reel IDs – and initially there was a frame rate problem. We set about writing our own software to embed the files with timecode and metadata, convert frame rates and design a way of working with the footage that would fit into a traditional editorial workflow, so that the EDLs at the editorial house actually made sense. If you just pulled the footage straight into an AVID and spat out an EDL, the EDL wouldn’t make any sense. So we had to design a system to make those EDLs make perfect sense to anyone trying to conform the footage.
“We wrote our own scripts for managing footage and writing metadata. We basically use a couple of commands that re-pitch material to the correct frame rate if needed, embed the timecode and reel ID, and simultaneously convert it to the formats required. Most of our clients are working on AVIDs, so we’ll create AVID MXF files in the format they want, whether that’s PAL or DNxHD, and those will carry all of the clip metadata we put in.
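MPC’s scripts are proprietary, but the step Vincze describes – embedding timecode and a reel ID while transcoding DSLR footage to an AVID-friendly DNxHD MXF – can be sketched in outline around a tool such as ffmpeg. The filenames, bitrate and timecode below are purely illustrative, not MPC’s actual settings:

```python
# Illustrative sketch only: builds an ffmpeg command that embeds start
# timecode and a reel ID while transcoding H.264 DSLR footage to DNxHD
# in an MXF wrapper. All values here are hypothetical examples.

def build_dailies_cmd(src, dst, timecode, reel_id):
    """Return an ffmpeg command list for a metadata-carrying daily."""
    return [
        "ffmpeg", "-i", src,
        "-timecode", timecode,                # start timecode for the output
        "-metadata", f"reel_name={reel_id}",  # reel ID so EDLs conform cleanly
        "-c:v", "dnxhd", "-b:v", "120M",      # a common 1080p DNxHD bitrate
        "-pix_fmt", "yuv422p",
        dst,
    ]

cmd = build_dailies_cmd("MVI_0001.MOV", "A001_0001.mxf",
                        "01:00:00:00", "A001")
print(" ".join(cmd))
```

The point is that the timecode and reel metadata travel with the transcoded clip, so an EDL cut against these dailies points back at the right source files.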
“We use the same scripts whatever format we’re transcoding, whether it’s Phantom, RED or P2 footage. We extract all the metadata, convert everything to the required format and write that metadata back into the AVID projects.”
RAW files are handled in exactly the same way; they just go through a slightly different pipeline, with dedicated tools for converting each RAW format – RED footage, for example, goes through RED’s command line, and there are other tools for Phantom and Silicon Imaging files. “All the different codecs have slightly different pathways, but that’s all automated. All you have to do is plug in, tell it what the source is and tell it what the output is, and it works it all out.”
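The “tell it the source and the output” idea boils down to a dispatch step: each camera format maps to its own decode pathway before a common transcode stage. As a hypothetical sketch (the tool names other than RED’s own decoder are made-up stand-ins):

```python
# Hypothetical dispatch sketch: route each source format to its decoder,
# then hand off to a common encode stage. Only "REDline" corresponds to
# a real tool; the other decoder names are illustrative placeholders.

RAW_DECODERS = {
    ".r3d": "REDline",        # RED's command-line decoder
    ".cine": "phantom_tool",  # placeholder for a Phantom converter
    ".siv": "si_tool",        # placeholder for a Silicon Imaging converter
}

def pick_pipeline(source_file, output_format):
    """Choose the decode pathway from the file extension; compressed
    formats with no RAW decoder fall through to a generic transcoder."""
    ext = source_file[source_file.rfind("."):].lower()
    decoder = RAW_DECODERS.get(ext, "ffmpeg")
    return {"decode": decoder, "encode": output_format}

print(pick_pipeline("A001_C001.R3D", "DNxHD"))
```

The operator never picks the pathway by hand; naming the source and the target output is enough.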
In a way that’s what makes the Data Lab earn its keep: the file transcoding, but also the speed with which they can push formats through. “Part of it is the speed with which we can turn around this kind of footage. With the Canon footage, for example, we’ve had two huge jobs through: one for Unicef, with 21 hours of footage, and another for Canon themselves, with 35 hours. They were dumping the footage on set to a hard drive and bringing us the drive. We took the drive in; the Unicef job we processed in about eight hours and the Canon one in under 12 hours. So there are huge savings to be made – in fact the Canon footage started somewhere else, and after three days they still hadn’t got any footage!
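Taking the quoted figures at face value, both jobs ran well faster than real time:

```python
# Quick arithmetic on the turnaround figures quoted above:
# hours of footage divided by hours of processing gives the
# faster-than-real-time ratio for each job.
unicef = 21 / 8    # 21 hours of footage in about eight hours
canon = 35 / 12    # 35 hours of footage in under 12 hours
print(round(unicef, 1), round(canon, 1))  # roughly 2.6x and 2.9x real time
```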
“Part of it is that we’ve written a lot of our own software, and we continue to add new modules for any new format that comes along.
“My philosophy was always to try to use consumer-based software and hardware, so rather than spending £100k on a Baselight just for dailies processing you have a few Mac Pros and gang them all together. They can obviously do other things as well. The advantage of writing the software that handles all this footage is that when a new format comes out you don’t need another machine to do a separate job; you just write the requisite software module. So even though it’s a big workflow to set up, it’s really future-proofing.”
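The “gang a few Mac Pros together” approach amounts to spreading transcode jobs across a pool of ordinary machines. A minimal sketch of that scheduling idea, with made-up hostnames and clip names:

```python
# Minimal sketch of farming dailies out to a pool of consumer machines:
# clips are dealt round-robin across the available workers. Hostnames
# and clip names are invented for illustration.

from itertools import cycle

def assign_jobs(clips, workers):
    """Spread transcode jobs evenly across the available machines."""
    queue = {w: [] for w in workers}
    for clip, worker in zip(clips, cycle(workers)):
        queue[worker].append(clip)
    return queue

jobs = assign_jobs([f"clip_{i:03d}.mov" for i in range(7)],
                   ["mac01", "mac02", "mac03"])
print({w: len(c) for w, c in jobs.items()})  # {'mac01': 3, 'mac02': 2, 'mac03': 2}
```

Because each clip transcodes independently, throughput scales roughly with the number of machines – which is why a rack of iMacs can stand in for one expensive dedicated box.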
They proved as much when they had to process footage from 56 RED cameras in a hotel room in Canada last year for a Nike job. The cameras were shooting simultaneously, and 32 iMacs in the hotel room processed the footage.
“As far as the Canon is concerned, I think we are unique in our ability to rewrite all the metadata back into the Canon files; I don’t know anyone else who does that. On the other hand, everyone is processing RED files, so we concentrate on speed of turnaround and making sure it doesn’t interrupt editorial schedules. A lot of footage, especially in the commercials world, needs such a quick turnaround that it demands that level of service – and the fact that we do a proper QC on everything.”
On a lot of jobs MPC manage the process from capture to delivery. “We’ll coordinate and provide a workflow structure with the DITs, sound recordists, editorial, vfx pipelines and DI. We’ll manage that in terms of workflow and colour science. It’s something we’ve been doing at MPC in terms of vfx pipelines for years, so we’re harnessing that knowledge of colour science and translating it into the digital world.
“A lot of the time it will just be recommending a particular camera format for a specific job. At the moment the different camera formats all have pros and cons. For example, you wouldn’t do any green screen work or 3D tracking on the Canon cameras. In my experience, if you have lots of compositing work you might want to go for a RED with a bigger image size – if you really want to get into the detail you can work on 4K plates. If you want to do lots of 3D tracking and lots of fast pans, you may not want a camera with a rolling shutter; you might want something with a global shutter, such as a Sony F35 or a Genesis. Each of them has different pros and cons.
“We try to give people an overview of why they should choose a particular camera, from on-set considerations to the post point of view – how these things affect us has an impact on the best choice of camera for the job.
“The other thing is working with DITs on how best to set up the camera to give the best workflow down the line.
“The DIT is an increasingly vital job. It also depends what people mean by the term: there is definitely a difference between Europe and the US, but within the UK too there are lots of different descriptions of the job. In the US, DITs concentrate purely on the image – they won’t necessarily go near the camera, and there is probably someone else managing the data back-up. In the UK it’s often just data back-up, and they won’t necessarily understand the imaging aspects.
“There’s a huge grey area at the moment, and we’re working with people like the BSC to establish the DIT as a proper camera department grade, so we can outline exactly what the roles are. We’re going to see a situation like the stills world, where film use decreased in the space of one or two years. There will have to be a particular reason for shooting film, because budgets will be so constrained and film will get more expensive as fewer and fewer people use it. Digital workflows will become cheaper.
New Cameras and Stereo
We asked Chris which new cameras interest him at the moment. “The new Arri cameras look amazing, and also RED’s new Mysterium-X sensor. I think those two are really going to push the envelope of what’s achievable with digital cameras. When people can shoot at, say, 800 or 1000 ASA with no noticeable quality difference, that’s going to affect budgets – lighting, for example. So the effect of these new cameras propagates throughout: when you can save that much money on some of the technical aspects, you’ve got more time to shoot and you’ve got more choice.”
It’s early days for the use of digital in feature films, but the Data Lab has already had feature clients through the doors: the Phantom slow-motion fighting shots in Sherlock Holmes, a few Phantom shots in Kick-Ass, and in Robin Hood the Phantom shots of Robin shooting the bow as well as some Genesis footage. They are mostly special shots in a 35mm environment – specific shots for specific reasons.
“But we’re also talking to a few clients at the moment, especially about the use of stereoscopic material. We’re seeing a much bigger switch to digital purely because they’re shooting stereo. The workflow is so much easier with digital footage than with film – things like matching and stabilising plates are far simpler.
“We offer a 3D dailies service which we’ve just launched. 3D dailies are that much more complicated, primarily because you have to process twice as much footage. On a monoscopic shoot it’s mostly creating dailies using embedded metadata and timecode; with a stereo shoot you’ve also got the whole stereo correction and depth grade processes to go through, and there’s no automatic way of doing that. We’re providing a service that lets editors view their footage in stereo: we can provide a side-by-side stereo version as well as a mono version, and that will match up with the timecode and metadata of the footage they’ve already got.
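A side-by-side stereo daily like the one described squeezes each eye to half width so the pair fits a standard 1920×1080 frame that any editing system can play. MPC’s pipeline isn’t public, but the geometry can be sketched with an ffmpeg filtergraph builder (filenames here are examples):

```python
# Illustrative sketch: build an ffmpeg command that scales each eye to
# half width and stacks them horizontally into one mono-compatible frame.
# Filenames are hypothetical; MPC's actual pipeline is not public.

def side_by_side_cmd(left, right, out, width=1920, height=1080):
    """Return an ffmpeg command producing a side-by-side stereo daily."""
    half = width // 2
    graph = (
        f"[0:v]scale={half}:{height}[l];"  # squeeze the left eye
        f"[1:v]scale={half}:{height}[r];"  # squeeze the right eye
        f"[l][r]hstack"                    # place the eyes side by side
    )
    return ["ffmpeg", "-i", left, "-i", right,
            "-filter_complex", graph, out]

cmd = side_by_side_cmd("left_eye.mov", "right_eye.mov", "sbs_daily.mov")
print(" ".join(cmd))
```

Because the result is an ordinary single-stream file, it cuts in a conventional editorial system while a matching mono version carries the same timecode and metadata.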
“In that kind of workflow the corrections have to be baked in; there’s no way of changing them on the fly. As we see more 3D-enabled editing systems, you’ll be able to change the stereo as you’re working with the footage.
“For example, Cineform’s NEO 3D QuickTime plug-ins or the new AVID stereo workflow will enable that. But what we’re doing right now is for people who don’t have any additional hardware. The other way we can do it is to let people edit in 2D and then send us their rough cuts to look at in 3D. We can conform it all, do a rough depth grade and the matching, and they can watch it here in stereo.
“We use an Iridas FrameCycler/SpeedGrade set-up, but if it’s something we need to render out we can do it with our Nuke tools. For dailies it doesn’t have to be so exact, as long as it’s viewable and comfortable to watch. For the final conform we’ll use different tools that take longer and get a better result.”