Markerless Tracking Technology Brings Down Cost Of VR

zLense claims that its rig can be used with most cameras.

zLense has launched a depth-mapping camera solution that captures scenery and 3D data in real time and adds a three-dimensional layer to the footage. The technology processes spatial information, making new three-dimensional compositing methods possible and enabling production teams to create 3D effects and use CGI in live TV or pre-recorded transmissions - with no special studio set-up.

The system is similar to one introduced at last year's IBC by Ncam, a company partly backed by FilmLight, which demonstrated real-time, in-camera visualisation of immersive 3D/CG graphics content while shooting, on FilmLight's stand.

Using its markerless camera tracking technology, Ncam opens up new possibilities in immersive graphics. Environments that were previously impossible to track are now accessible, enabling quicker set-ups and almost instant tracking anywhere, in studio or on location. The Ncam technology is available as a turn-key solution.

The Ncam system provides low-latency, real-time visualisation, making it ideal for 2D and 3D virtual sets and animated characters, as well as for the insertion of logos and graphics. It can also be used with blue or green screen through its own keyer. It supports third-party graphics engines such as Vizrt through the integrated Free-D protocol for broadcast augmented reality, and also includes a MotionBuilder plug-in option for seamless FBX file exchange.
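The Free-D protocol mentioned above is a simple fixed-length UDP format for streaming a camera's pan/tilt/roll, position, zoom and focus to a graphics engine each frame. As a rough illustration only (this is not Ncam's implementation; the 29-byte type-D1 layout, field scaling and checksum below follow the commonly documented version of the protocol and should be checked against the specification), a decoder might look like this:

```python
def _s24(b: bytes) -> int:
    """Decode a 3-byte big-endian signed integer (two's complement)."""
    val = (b[0] << 16) | (b[1] << 8) | b[2]
    return val - (1 << 24) if val & (1 << 23) else val

def decode_freed_d1(packet: bytes) -> dict:
    """Decode a 29-byte Free-D type-D1 camera tracking packet."""
    if len(packet) != 29 or packet[0] != 0xD1:
        raise ValueError("not a Free-D D1 packet")
    # Checksum: 0x40 minus the sum of all preceding bytes, modulo 256.
    if (0x40 - sum(packet[:28])) % 256 != packet[28]:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": packet[1],
        "pan_deg":  _s24(packet[2:5])  / 32768.0,  # angles in 1/32768 degree
        "tilt_deg": _s24(packet[5:8])  / 32768.0,
        "roll_deg": _s24(packet[8:11]) / 32768.0,
        "x_mm": _s24(packet[11:14]) / 64.0,        # positions in 1/64 mm
        "y_mm": _s24(packet[14:17]) / 64.0,
        "z_mm": _s24(packet[17:20]) / 64.0,
        "zoom":  _s24(packet[20:23]),              # manufacturer-specific units
        "focus": _s24(packet[23:26]),
    }
```

Because every message is the same fixed length with a trailing checksum, a renderer can validate and consume tracking data with effectively zero parsing overhead, which is what makes the protocol suitable for per-frame, low-latency use.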

zLense in practice.

zLense

zLense says directors can produce simulated and augmented-reality worlds, generating and combining virtual-reality (VR) and augmented-reality (AR) effects in live studio or outside broadcast transmissions. The depth-sensing technology allows for 360˚ freedom of camera movement. Directors can combine dolly, jib arm and handheld shots as presenters move within, interact with and control the virtual environment.

“We’re poised to shake up the Virtual Studio world by putting affordable high-quality real-time CGI into the hands of broadcasters,” said Bruno Gyorgy, President of zLense.

“With minimal expense and no special studio modifications, local and regional TV channels can use this technology to enhance their news and weather graphics programmes – unleashing live augmented-reality, interactive simulations and visualisations that make the delivery of infographics exciting, enticing and totally immersive for viewers,” he continued.

The zLense Virtual Production platform combines depth-sensing technology and image-processing in a standalone camera rig that captures the 3D scene and camera movement. The ‘matte box’ sensor unit removes the need for external tracking devices or markers, while the platform’s built-in rendering engine looks to cut the cost and complexity of using visual effects in live and pre-recorded TV productions. The platform can also be used alongside pre-existing rendering engines, VR systems and tracking technologies.

The Ncam system was shown at both last year's and this year's IBC Convention.

Ncam

The Ncam system can record tracking and lens metadata to inform the downstream post process. It can be fitted to any film or broadcast camera configuration, whether dolly, crane, handheld or Steadicam, and it works with spherical or anamorphic, prime or zoom lenses. Set-up time is minimal, ensuring no delay to the production and no changes to the workflow. Any number of cameras can be used simultaneously without interference.

The huge advantages of using Ncam on set have been proven on many feature film productions in the last year, including: White House Down, Edge of Tomorrow, Muppets Most Wanted, Jupiter Ascending and Our Robot Overlords.  Existing clients include Warner Bros, Sony, Disney and Fox.

It quickly became a relied-upon production tool for Roland Emmerich, director of White House Down, who commented, “Normally you can’t pre-vis Steadicam shots because the Steadicam operator and the actor do a dance. You have to see it — and with Ncam I could see it, in real-time, with real people in frame. Ncam is now my favourite tool.” 

Ncam Technologies is a collaborative joint venture with FilmLight Limited. Ncam is headquartered in London, with offices in the US.

www.zlense.com

More about Ncam

Posted on October 29, 2014 and filed under audio, mocap, post, vfx.