Working on your first feature can be daunting enough without the added pressure of using new technology. Even the name screams caution – Red – “Danger, Will Robinson! Danger!”
When I was first asked to work on Broken Hill, producers Christopher Wyatt (Napoleon Dynamite) and Julie Ryan (Ten Canoes) asked if I knew anything about the Red One camera system and after a moment’s thought I replied, “yeah, that it’s a nightmare for post production. Where do I sign up?” After spending the previous 12 months working on back-to-back documentary television series I was up for something with more technological challenges and this looked like just the ticket.
On a job like this I was blessed to be working with such an experienced crew and production team that most normal filmmaking concerns could be put to the back of the line and I could focus on the Red situation, at least in the initial couple of weeks.
From my first look at the camera I was excited. It looked cool. Now that may seem irrelevant, but let’s face it, ‘the cool factor’ helps with enthusiasm to push through the not so cool parts (and there are quite a few of those). To the best of my knowledge Broken Hill was the first feature film to use Build 15 of this camera, which is the first version to be released with sound recording capability. That’s excellent news for an editor like me who has only ever worked in the video world, where sound and vision come in as one. Yes, that’s right. Laugh it up, but the truth is I am one of the many next-generation filmmakers to have missed a crucial part of the word filmmaker – film itself. This may seem strange to many readers, but I’m sure in the coming years editors who have actually worked with film, perhaps even touched the stuff on a Steenbeck or Moviola, will be as rare as a DJ who still scratches on vinyl.
Anyway, back to the sound thing. Unfortunately, as with many aspects of the Red system, they’ve worked out the production side of things before the post. How unfamiliar. The realisation that most of the post-production pathways do not accommodate the new embedded sound (i.e. RedCine does not read the sound at all and is still considered a beta version) brings on flashbacks of that wonderful world of HDV. I remember the big hype about HDV and how it was going to revolutionise video making. How brilliant – an HD resolution camera that fits in the palm of your hand and uses miniDV tapes. I was one of the first guinea pigs to test this format in the country (Australia, that is). There were big problems, and I can feel that searing pain in my frontal lobe returning just thinking about it. Final Cut had huge lag times for editing; the image could not be displayed on a broadcast monitor; most filter operations made the image fall to pieces; re-conforming was and still is a nightmare; and the pathway to master it back to tape and DVD was nothing short of excruciating.
So here I am again. A guinea pig. We did some tests with sound and found that the one saving grace of the format is that the proxies did support the sound. Brilliant! Now we just needed to work out what format we were going to edit in. As there is no feasible way to work with the 4K Red RAW files natively, we knew that the edit had to be done at offline resolution. First we looked at using the Red proxies. For those that don’t know, this is an amazing feature of the camera: Red creates three proxies of each clip you record – a 2K, a 1K and a half-K. However, the proxies are not great for editing as they are prone to corruption. The proxy relies on the presence of the original 4K R3D (Red RAW) file and so cannot be moved. Displaying the pictures is also processor hungry, so even if you have an eight-core Mac, editing can be sluggish and there are delays when starting and stopping the playhead. So we decided to go the transcoding path instead, which meant that all we needed to do was transcode the 2K proxies to HD ProRes and off we go.
Unfortunately sound, as always, is the last consideration of any film production. Just as I’d worked out the transcoding method, the camera department decided it didn’t want the burden of extra sound cables hanging out the back of the camera. So sound was recorded on DAT instead. Yes, that’s right. We were now dealing with one of the latest digital technologies and one of the oldest. This changed our post path dramatically, as we now had the added task of syncing and backing up DAT tapes. This ended up being a godsend as it created a new position for an assistant editor. Back before editing software became so common that you could cut your friend’s wedding on your mobile phone, there were people called assistant editors. Their job was to generally help with the running of the editing department, which has since been degraded to one person and a PC.
Testing is important!
It’s extremely important when dealing with new technologies to test out your pathway pre-shoot. We didn’t. Well at least not entirely. Cinematographer Nick Matthews did do some early camera tests with the make-up department. Knowing that this was an entirely new beast he had taken the time to research the system and made sure he was not only given the camera well in advance, but that he could print some tests to film and see it on the big screen. One week before cameras were set to roll (yes, cameras plural) we sat in a local cinema and watched the mere 1 minute of film we could afford loop over and over again. The footage was nothing exciting, just some shots of the actors standing around and some camera cards, but it looked great. Nick described it as “pristine 35mm”. I agreed. Now we just needed to be as sure of everything else.
One of my main concerns from the beginning was storage. This is a big change to the ‘normal’ production pathway. As a tapeless digital acquisition format, the Red lends itself to the ever-concerning area of data corruption. Hard drives are fickle little f….ummm….devices. As many of us know, hard drives lose data, corrupt files and generally do strange things at the most imperfect of times. Now, take the general failure rate of this equipment in an ideal post-production environment, with reasonably controlled temperatures, clean electricity and dust-free rooms, and then ask the question, “would this hard drive survive being shaken like a paint mixer from a helicopter flying 100 kilometres an hour in the harsh sun of the Australian desert while still recording 4K files 24 times a second?”
So you have a choice here. Say to the director, “hey, maybe we should pull down the scope of the film to just tripods and tracks”. Good luck. The best, well really the only option is, “roll camera and let’s hope for the best.” It turns out this concern was not unfounded. On some particularly bumpy patches the camera developed an odd strobing. I presume this is in fact the camera dropping frames as data is being corrupted on the way to our solid little friend. The production crew simply re-shot when this occurred, which was only once or twice, so no biggy.
What was a ‘biggy’ was that one of our hard drives died altogether. Luckily we had an excellent Data Wrangler, Brie Walsh, who was very quick at keeping up with backups while they shot. The drive proved unrecoverable, but because of the quick turnaround of hard drives and flash cards we only lost about 10 minutes of footage.
I should probably mention here what a Data Wrangler is. This is a new production position created for tapeless digital formats. Essentially their job is to keep track of all the files recorded, make sure the files aren’t corrupt, check focus and other camera issues, and then back everything up. It’s an interesting position as it falls somewhere between camera assisting and post-production. I cannot express just how crucial this role is and how important it is to get someone with experience to fill this position.
Once footage was shot the motto was “back up…back up…back up”. That’s right. Three times. First the data was backed up onto two external FireWire enclosures so that cards and hard drives could be formatted and sent back to the camera department. To do this we used a little program called R3D Manager, which also ran checksums on the footage to make sure all was A-OK.
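For the curious, that wrangling step boils down to copy-then-verify: hash the original, copy it, hash the copy, and only call it a backup if the two match. Here’s a minimal sketch of the idea in Python (we used R3D Manager on the shoot; the MD5 choice and function names here are my own illustration, not its internals):

```python
import hashlib
import shutil
from pathlib import Path

def md5sum(path, chunk=1 << 20):
    """Return the MD5 hex digest of a file, read in 1 MB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def backup_with_checksum(source, destinations):
    """Copy `source` into each destination directory and verify each
    copy's checksum against the original before reporting success."""
    src_hash = md5sum(source)
    verified = []
    for dest_dir in destinations:
        dest = Path(dest_dir) / Path(source).name
        shutil.copy2(source, dest)
        if md5sum(dest) != src_hash:
            raise IOError(f"checksum mismatch on {dest}")
        verified.append(dest)
    return src_hash, verified
```

Only once every destination verifies clean can the card or drive safely go back to the camera department for formatting.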
At the end of each day the drives were sent to separate locations just in case something happened in transit. One drive was put through Scratch, transcoded to 1080p HD ProRes files and then backed up onto LTO tape for storage in a fire-safe vault. The other was brought to the edit suite and backed up onto a media vault.
That’s a lot of backups. But hard drives are cheap, re-shooting isn’t.
This post production pathway was based on much research on the internet, especially on Reduser.com (an invaluable site for anyone taking on this format). The transcoding of the files can be done a number of ways, including simply using Compressor or Red’s own free software RedCine. However, we decided on Scratch simply because if you have Scratch you use Scratch.
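Whatever tool you pick, the batch transcode step amounts to the same thing: walk the proxy folder and run one encode per clip, keeping the clip name intact for the conform later. A rough sketch of that loop, assuming an ffmpeg-style command line (the `_H.mov` proxy naming and the exact flags are illustrative, not our actual Scratch pipeline):

```python
from pathlib import Path

def build_transcode_commands(proxy_dir, out_dir):
    """Build one ffmpeg-style command per 2K proxy, targeting ProRes.
    Crucially, the output keeps the clip's original name so the
    offline media still matches the camera originals at conform time."""
    cmds = []
    # Assumed naming: the 2K ("half") proxy ends in _H.mov; adjust to taste.
    for proxy in sorted(Path(proxy_dir).glob("*_H.mov")):
        out = Path(out_dir) / proxy.name
        cmds.append([
            "ffmpeg", "-i", str(proxy),
            "-c:v", "prores",      # offline-friendly ProRes
            "-s", "1920x1080",     # downscale 2K to HD for the edit
            str(out),
        ])
    return cmds
```

Each list is then handed to the shell (or `subprocess.run`) overnight; by morning there’s an HD ProRes twin of every proxy.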
When I first signed onto this project we worked out that, based on a 20:1 shooting ratio, I would need roughly two terabytes of hard drive space for the Red files plus a further two terabytes for the offline footage. Here’s probably the scariest thing about digital technologies. Hard drives are cheap, and so directors shoot a lot, and with multiple cameras. And I mean a lot. On some scenes we reached ratios of 90:1. That’s an hour and a half of footage for a one-minute scene!
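The back-of-envelope maths is simple enough to script, and worth doing before the drives are ordered. A sketch, with the codec data rates as rough illustrative assumptions (REDCODE at roughly 28 MB/s, offline ProRes somewhere under 20 MB/s) rather than exact figures:

```python
def storage_needed_tb(runtime_min, shooting_ratio, mb_per_sec):
    """Rough storage estimate in decimal terabytes for a cut of
    `runtime_min` minutes at a given shooting ratio and data rate."""
    footage_sec = runtime_min * shooting_ratio * 60
    return footage_sec * mb_per_sec / 1_000_000  # MB -> TB (1 TB = 1e6 MB)

# Illustrative rates: ~28 MB/s for the 4K R3Ds, ~18 MB/s for offline ProRes.
raw_tb = storage_needed_tb(100, 20, 28)      # 100-minute feature at 20:1
offline_tb = storage_needed_tb(100, 20, 18)
```

Run the same sum at 90:1 instead of 20:1 and you see why those scenes alone can swallow the drive budget for a whole planned shoot.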
This has a huge effect on post-production. While this may be fairly standard for directors like Spielberg and Lucas, who can afford to spend six months in an edit suite, independent productions don’t have the budget to sustain such a long offline process, so my advice is to still treat your shooting ratio like film. It may not be costing you in media every time you roll on and off, but the crew standing around probably aren’t doing it for free, and neither are the people who wade through it all at the end.
We used Final Cut Pro for the offline edit. I’m a big fan of this software and have been using it since version 1.0. In my opinion the best thing about this program over Avid is its user support: when dealing with newer technology, the global user network finds all the bugs far quicker than the actual company making it.
At the time of writing this the offline edit is still underway, but here’s the plan on how it’ll get back to the 4K Red files and up onto the big screen. An EDL is exported from FCP and re-conformed in Scratch back to the 4K Red RAW files (be careful not to change the name of your offline files as they will need to match the original camera files). These are then graded and exported for printing back onto 35mm film. Sounds simple enough, but there are many dangers along the way which I’m sure will become apparent.
In the end the process hasn’t been that scary, and for most post people it is, in actuality, more familiar than dealing with 35mm film. Red is in digital form from start to finish, and for me this is a calming thought. I understand 1s. I understand 0s. Now I just need to work out how to get those numbers from point A to point B without having too many flashbacks of year 12 algebra.
It goes without saying that Red is here to stay and it will certainly find a firm place in independent film production. The good news is technologies like this will allow for lots more independent production. The bad news is there’ll be lots more indie films fighting for the same market dollar. As for us post people, I look forward to the process becoming smoother and there being better support, especially at the software level. But let’s face it, as soon as things look like they’re sorted out, some other camera technology will come along and it’ll be back to the drawing board.