An extension of ‘bullet time’ featured 96 still cameras and two 4K JVC video cameras in a 270° arc to help capture the auditions for Sky’s popular dance show.
Television producers are always looking for that magic shot that will lift their show above the rest. Thanks to EVS and others, slow motion is now instantly available, but the desire to bend time further has driven Sky HD’s Got To Dance to evolve the famous Matrix ‘bullet time’ sequence, with a little help from One Ten Productions.
This month the auditions for the live shows of Got To Dance were being held on Clapham Common in London as an OB in a huge temporary dome. The scene is a normal TV studio (see picture above) except for the ring of cameras in front of and to the side of the main stage where the auditions happen. This is the Time Freeze set-up, in which 96 Canon 1100D DSLRs are laser-measured around a 270° arc, one still for each frame of a four-second record. Each DSLR delivers a 4272 x 2848 JPEG, while the JVC GY-HMQ10s output 3840 x 2160 video. One JVC brings the programme into the Time Freeze segment as the in point; the sequence itself is triggered by an assistant in the audience. The second 4K JVC takes the programme back out to the main Arri Alexa feed as a video mix. There is also a RED Epic wedged in between the Canons at the front of the stage for 120fps slow motion.
The stills are matched with the 4K video from the JVCs and then digitally zoomed in to a 1080p signal in preparation for the edit.
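The digital zoom from the oversized stills down to 1080p can be pictured as a centred crop-and-scale: the higher the zoom factor, the smaller the source window that is mapped onto the 1920 x 1080 output. A minimal sketch of that window calculation (the function name and zoom convention are assumptions for illustration, not One Ten’s actual pipeline):

```python
def crop_window(src_w, src_h, zoom):
    """Centred source window for a digital zoom to 1080p.

    At zoom 1.0 the whole frame is used (scaled down); at zoom 2.0 a
    3840 x 2160 frame yields a pixel-for-pixel 1920 x 1080 crop.
    """
    # Window shrinks as zoom increases.
    win_w = round(src_w / zoom)
    win_h = round(src_h / zoom)
    # Centre the window in the source frame.
    x0 = (src_w - win_w) // 2
    y0 = (src_h - win_h) // 2
    return x0, y0, win_w, win_h

# A 2x zoom on the JVC's 4K frame gives a native-resolution 1080p crop:
print(crop_window(3840, 2160, 2.0))  # (960, 540, 1920, 1080)
```

The 4272 x 2848 DSLR stills leave even more headroom, so a push-in can be applied in post without ever scaling up.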
All DSLRs are prepared with the same exposure and focus settings and are angled looking upwards with identical wide-angle lenses for a more dramatic perspective. The production company sets up the DSLRs in banks of eight with the same exposure, aperture and colour temperature using proprietary software. The positioning works out so that each camera contributes one frame of the sequence. The spacing between adjacent cameras is close to the interocular distance, so another thought is to use pairs of cameras to produce stereoscopic 3D images, although line-up would then be even more critical, more cameras would probably be needed, and the wider rig might see too much of the set.
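The geometry behind the one-camera-per-frame arrangement is simple to sketch. With 96 cameras on a 270° arc there are 95 gaps, and 96 stills played back at 24fps run exactly four seconds. The rig radius and frame rate below are assumptions for illustration, not measured figures from the show:

```python
import math

NUM_CAMERAS = 96
ARC_DEGREES = 270.0

def camera_geometry(radius_m, fps=24.0):
    """Angular step, physical spacing and playback duration of the rig."""
    # 96 cameras spanning the arc means 95 equal gaps between them.
    step_deg = ARC_DEGREES / (NUM_CAMERAS - 1)
    # Arc length between adjacent cameras at the given radius.
    spacing_m = radius_m * math.radians(step_deg)
    # One still per playback frame gives the sweep duration.
    duration_s = NUM_CAMERAS / fps
    return step_deg, spacing_m, duration_s

step, spacing, duration = camera_geometry(radius_m=1.3)
print(f"{step:.2f} deg between cameras, "
      f"{spacing * 1000:.0f} mm apart, {duration:.1f} s sweep")
```

At a hypothetical radius of about 1.3m the spacing comes out near 65mm, which is roughly the interocular distance mentioned above; a larger rig would space the cameras proportionally wider.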
For the initial line-up, the crew sights each camera optically through its lens using nothing more than a stick with an LED on top. The LED is viewed on a laptop with a cross-hair overlay and matched optically before any digital nudging in After Effects.
The resulting 4K video and the image sequence from the still cameras are then sent to an edit station outside the dome and rendered together live in After Effects, which also applies automatic nudging to make sure the images match up perfectly. Scripts and macros sort the takes into different folders, as every act gets its own Freeze.
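The automatic nudging step amounts to estimating a small residual offset between adjacent frames and shifting one to match the other. How After Effects does this internally isn’t documented here, but the idea can be sketched as a brute-force search for the integer-pixel shift that best aligns two frames (function name and error metric are illustrative assumptions):

```python
def estimate_nudge(ref, img, max_shift=3):
    """Integer-pixel shift (dx, dy) that best aligns img to ref.

    Brute-force search minimising mean absolute difference over the
    region where the shifted frames overlap.
    """
    h, w = len(ref), len(ref[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = n = 0
            for y in range(h):
                for x in range(w):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < h and 0 <= sx < w:
                        err += abs(ref[y][x] - img[sy][sx])
                        n += 1
            if err / n < best_err:
                best_err, best = err / n, (dx, dy)
    return best
```

A real implementation would work on downsampled luminance and sub-pixel offsets, but the principle is the same: the optical line-up gets the cameras close, and a small search like this removes the last pixel or two of error.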
They then add some flicker reduction to make the images scan properly. The whole process takes about seven seconds, which has given the producers the option to use Time Freeze on the live shows next March. It works brilliantly, and I think this is something they will definitely do, although given the length of the render and the need to match up sound it will probably be used as a review system for the judges.
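Flicker in a rig like this comes from small exposure differences between 96 separate cameras, so each frame pulses in brightness as the sequence plays. One common fix, sketched here as a minimal assumption rather than One Ten’s actual method, is to scale every frame so its mean luminance matches the sequence-wide mean:

```python
def deflicker(frames):
    """Equalise global brightness across a burst of frames.

    Each frame (a 2D list of luminance values) is scaled so its mean
    matches the mean across the whole sequence.
    """
    means = [sum(sum(row) for row in f) / (len(f) * len(f[0])) for f in frames]
    target = sum(means) / len(means)
    out = []
    for f, m in zip(frames, means):
        gain = target / m if m else 1.0  # leave all-black frames alone
        out.append([[px * gain for px in row] for row in f])
    return out
```

Per-frame gain like this removes the global pulse; residual local flicker (e.g. stage lighting cycles) would need a more selective correction.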
The thinking for next year is a 360° rig, but whatever they do the effect is perfect for this kind of dance show, as it shows off the dancers’ extensions and shapes. The challenge for One Ten is to bring the render time down even further so it can be factored into a live show, not just used as a judges’ review tool.