The efficiency of game engines in processing polygons and assets is enabling live VFX everywhere from feature films to live broadcast; let the games begin
In a perfect storm moment at the Game Developers Conference in March this year, Epic Games and The Mill showed how VFX is now possible as a live cinematic element.
This perfect storm had been brewing since The Mill showed its automotive reskinning technology in Cannes last year, where it won an innovation award. Before that, in 2014, the Unreal Engine was opened up for anyone to use in return for a royalty on whatever was ultimately created with it. The genie was well and truly out of the bottle, and technologists from architecture to the film world got busy finding uses for it. We are now seeing the fruit of that licensing deal.
At this year’s GDC the two companies brought their newest technologies together to produce an impressive demo of live augmented reality in a short film for Chevrolet called Human Race. Angus Kneale from The Mill picks up the story: “We’ve layered so many different technologies together to pull it off in real time on stage. There’s been talk of using game engines to do our type of work, but it had never been demonstrated at this level before.”
The result, live on stage, was a reskinning of The Mill’s BLACKBIRD mule vehicle – search YouTube for the event. There was also a demo of live recolouring of Chevrolet’s cars as a consumer customisation tool. But this wasn’t as new as it seemed: UK company Ncam Technology had shown a similar demo at NAB last year with its camera tracking and depth mapping technology. Similar effect, wildly different tracking methods.
Nic Hatch from Ncam explains: “The BLACKBIRD is obviously a physical ‘mule’, if you like. That gets replaced by a virtual car, and typically you’ll get a real-time previs of the CG through the lens as you’re shooting, which is great because it enables you to compose your framing, which is what it’s all about really. Then you’ll finish it in post, which is the typical workflow, because it won’t be good enough in real time.”
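The depth-mapped previs Nic describes can be pictured in a few lines: with a per-pixel depth map of the live plate, real-time CG can be keyed in front of or behind real objects. The sketch below is a hypothetical illustration of that idea only, not Ncam's code; all names and depth values are invented.

```python
# Hedged sketch of depth-keyed compositing: keep whichever sample,
# live plate or CG, is nearer the camera at each pixel.

def composite(plate, plate_depth, cg, cg_depth):
    """Per-pixel depth test between a live plate and a CG render."""
    out = []
    for p, pd, c, cd in zip(plate, plate_depth, cg, cg_depth):
        # A CG depth of None means "no CG rendered here" (fully transparent).
        if cd is not None and cd < pd:
            out.append(c)      # CG occludes the live-action pixel
        else:
            out.append(p)      # live action wins (or no CG present)
    return out

# One scanline: a CG car (depth 4) passes behind a real lamppost (depth 2).
plate       = ["sky", "post", "road"]
plate_depth = [100.0,  2.0,    8.0]
cg          = ["car", "car",  "car"]
cg_depth    = [ 4.0,   4.0,    4.0]

print(composite(plate, plate_depth, cg, cg_depth))
# → ['car', 'post', 'car']
```

The point of the depth map is visible in the middle pixel: the real lamppost correctly occludes the virtual car without any hand-drawn matte.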
But a more efficient game engine can take that to another level. “What we’re saying, what The Mill and Epic are saying – and probably what most post production houses using Unreal are saying – is: wouldn’t it be nice to do that through the lens with VFX that was finished?
“This will start happening in a couple of years, perhaps sooner. We’re talking to filmmakers at the moment who want to do live visual effects in real time, but getting there isn’t a sudden leap. I think we’ll see less tweaking in post-production for standard visual effects, so look out for more virtual sets, partial virtual sets and green screen windows.
“You’ll see it in children’s TV, then perhaps lower-end episodic TV, and then that’ll push through to higher end film. Everyone will try to do this, and the questions will be around doing it in real time on location or on set. The Unreal Engine is super efficient at crunching polys and assets, and obviously there are a few different engines out there, like Unity – it’s the convergence of games and film, if you like.”
Epic has had its eye on the media industry for a while but the timing has never been quite right – until now. Angus Kneale says, “Photorealism is very important for The Mill; things have to look real. For the last 25 years we’ve been doing VFX, and one of our strong hallmarks is taking CGI elements, things generated in a computer, and bedding them into live action so you can’t actually tell the difference. That was always our calling card.
“We do a lot of car content for auto manufacturers and auto brands. What was interesting was we kept seeing recurring patterns in the market where a vehicle would be subtly changed from one model year to the next. They would change the headlights or the tail lights or the wing mirrors or something like that. Rather than reshooting all the content, which would be a huge cost, they would come to us and ask if we could just change and update the car to realise the current model. We’ve been doing that for years, but then we started doing bigger and bigger parts of the car until, a couple of years ago, we reached the point where we couldn’t even get the car for the shoot. So we had this great production, great director and amazing budget but no car!” continues Angus. “It was still in validation and the engineering team were still working on it.
“So we ended up using the previous model year of the vehicle, which had the same wheelbase as the new car, and completely reskinning it. Every frame was redone. But it was quite difficult; we had to capture all the streets it drove down. It was a real task. We thought there must be a better way of doing it. That’s where the idea for the BLACKBIRD came from.
“We thought, why don’t we build a custom vehicle that can essentially be a chameleon or mule? It can change its shape and wheelbase so it can get really compact or much longer, like an SUV, and your contact points with the road are in the right place for the car you’re recreating.
“Then what if we put in an electric motor, so we could programme the electronic speed control to imitate the performance of a regular petrol-driven vehicle? Electric motors are much better performance-wise than petrol engines, so we could mimic a car’s performance. We realised that we had enough ideas to go out and build a car from the ground up.
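Programming a speed control to imitate a petrol car can be sketched as a lookup table of the target vehicle's acceleration profile, interpolated in real time so the electric drivetrain holds the same speeds. This is a hypothetical illustration only; every figure below is invented and nothing here comes from The Mill's actual controller.

```python
# Assumed (time_s, speed_mph) points from a target petrol car's 0-60 run.
TARGET_PROFILE = [(0, 0), (2, 20), (4, 35), (6, 47), (8, 56), (10, 62)]

def target_speed(t):
    """Linearly interpolate the speed the mule should hold at time t (seconds)."""
    for (t0, v0), (t1, v1) in zip(TARGET_PROFILE, TARGET_PROFILE[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return TARGET_PROFILE[-1][1]   # hold top speed past the end of the table

print(target_speed(3))   # → 27.5, halfway between the 20mph and 35mph samples
```

A real controller would close the loop against wheel-speed sensors, but the table-plus-interpolation idea is the core of "imitating" another car's performance.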
“It took us a couple of years to do it, with layers of stabilisation for the cameras,” explains Angus. “We built it with a Californian company called Performance Filmworks; they provided the camera stabilisation system, and another company was commissioned to build the vehicle for us from our design. That was the BLACKBIRD, which was aimed at car design, automotive content and all that kind of stuff.
“Then we thought: what if we take the footage from the four different cameras on the roof of the vehicle and stitch it together, but not in post, because that would take weeks of rendering? What if you send it to the vehicle that’s filming the BLACKBIRD, stitch it together in real time, then use that to reflect into the vehicle? We then covered the BLACKBIRD with all these QR markers that allowed us to do very high-end augmented reality. Combining the two different technologies pushed it above the norm.
“One of the biggest problems is dealing with differing lighting conditions. You never know if the vehicle is going through the shadow of trees, turning a corner and going into shadow… Luckily on the professional camera side you’ve got companies like RED and ARRI who have been pouring tonnes of resources into digital cameras to try and match the dynamic range that film used to have. We looked at all the different cameras, GoPros and whatever was available for VR, and we realised that they didn’t have the dynamic range we needed to capture everything from the deepest shadows to the brightest point. We ended up getting four RED Epic 6K cameras together. Obviously we did loads of testing on lenses, in terms of which ones we could use to produce this fisheye effect where you’re capturing as much of the image as possible.
“Stitching those four cameras together gave us enough information, in terms of dynamic range and brightness, that we could then reflect back into the vehicle whether it’s going through a tunnel, shadow or sunlight.
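One way to picture the real-time stitch is as a directional lookup: for every direction around the vehicle, pick the roof camera whose optical axis points closest to it, then sample that camera's undistorted fisheye image into a 360° environment map. The sketch below illustrates only that selection step, not The Mill's pipeline; the four-camera layout and headings are assumptions.

```python
# Assumed compass headings (degrees) for four roof cameras.
CAMERA_HEADINGS = {"front": 0, "right": 90, "back": 180, "left": 270}

def nearest_camera(longitude_deg):
    """Pick the camera whose optical axis is closest to this direction."""
    def angular_distance(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)   # wrap-around distance on the circle
    return min(CAMERA_HEADINGS,
               key=lambda cam: angular_distance(CAMERA_HEADINGS[cam], longitude_deg))

# One row of a latitude-longitude environment map: for each column, note
# which camera's pixels would be sampled.
row = [nearest_camera(lon) for lon in range(0, 360, 30)]
print(row)
# → ['front', 'front', 'right', 'right', 'right', 'back',
#    'back', 'back', 'left', 'left', 'left', 'front']
```

A production stitch would also blend across seams and merge exposures for dynamic range, but the per-direction camera assignment is the skeleton of the technique.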
“So as you’re driving down the road you’re able to capture everything around the car. With the stabilised head all your reflections are pin sharp; if you had any vibrations, the reflections would be blurry, then sharp, then blurry again.
“We also put a LIDAR scanner – a laser scanner – in the middle of the camera cluster,” continues Angus. “That’s continuously running with time code so we can connect it to the footage that’s been shot directly beneath it. This is like sending out point data to figure out where things are in 3D space. It creates a point cloud for us while measuring about one million points per second. You then join the dots and essentially create a mesh. We can then project the footage on to the mesh.
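One step the timecoded LIDAR workflow implies is binning the scanner's roughly one million points per second into per-frame buckets, so each slice of the point cloud lines up with the footage shot beneath it at the same moment. This is a hedged sketch of that bookkeeping only; the frame rate and data layout are assumptions, and real meshing of the cloud is a much bigger job.

```python
FPS = 24   # assumed film frame rate

def bucket_points_by_frame(samples):
    """samples: iterable of (time_seconds, (x, y, z)) LIDAR returns."""
    frames = {}
    for t, point in samples:
        frame_index = int(t * FPS)            # which film frame this point belongs to
        frames.setdefault(frame_index, []).append(point)
    return frames

samples = [
    (0.010, (1.0, 0.0, 5.2)),
    (0.030, (1.1, 0.0, 5.1)),
    (0.050, (1.2, 0.1, 5.0)),   # falls in frame 1 (t >= 1/24 s)
]
buckets = bucket_points_by_frame(samples)
print(sorted(buckets))    # → [0, 1]  (frame indices present)
print(len(buckets[0]))    # → 2      (points landing in frame 0)
```

Once each frame has its slice of points, those points can be meshed and the matching footage projected onto the mesh, as Angus describes.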
“It gave us a lot of information, and with VFX, the more information you have the better. The reason we have so many QR codes on BLACKBIRD is that in automotive advertising you sometimes want to romance the front wheel or look at the headlights or you want to look at a very detailed part of the car and you want to compose it in an unusual way. So we realised that we probably wouldn’t see all of the car in shots like that, so we put a marker behind the wheels, in the front, at the sides, in all different places.”
With BLACKBIRD you can also adjust the wheelbase. You simply type in the length you want, within the four feet of travel, and the whole car expands as you watch. You can also change the width – the track – which expands by only a few inches.
The Mill initially thought they’d considered all the configurations that could possibly be needed within BLACKBIRD but they had forgotten something. “One of the first things that came up was that we couldn’t do full-size SUVs and full-size trucks – those axles are so far apart – so we’re building another one, Mule 2,” explains Angus.
“The Epic guys promised us features in the software that would enable us to do what we needed. They opened up their programming to give us what we needed.
“They made massive improvements to the code to allow us, for example, to put in the background plates. All those plates plus the CGI on top were actually going through the engine. Loading that into the engine at 24fps at full EXR resolution – that’s very big data. The hardware involved isn’t niche supercomputers, but off-the-shelf Intel-based architecture.”
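Some back-of-envelope arithmetic shows why "full EXR resolution at 24fps" is very big data. The resolution, channel count and half-float storage below are illustrative assumptions, not figures from The Mill; the article gives no exact numbers.

```python
# Assumed uncompressed 4K RGBA plate stored as 16-bit half floats.
width, height = 4096, 2160
channels = 4                  # RGBA
bytes_per_sample = 2          # 16-bit half float

frame_bytes = width * height * channels * bytes_per_sample
per_second = frame_bytes * 24   # 24 frames every second

print(round(frame_bytes / 2**20))      # → 68   (MB per frame)
print(round(per_second / 2**30, 2))    # → 1.58 (GB per second)
```

Under those assumptions the engine is ingesting roughly 1.6GB of plate data every second, before the CGI on top, which explains why serious code work was needed even on off-the-shelf hardware.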
Game engines, by virtue of their hugely efficient coding, are also being used for virtual set design. Look out for a company called Zero Density, which uses Unreal for its new product; it is stunning. Also check out Norwegian company Future Group with its Unreal-driven virtual set, Frontier.