13 Oct 2021 18:10
Virtual Production in the Data Arena
Combining real and virtual worlds with Unreal Engine
In early June 2021 — before Sydney's four-month Covid lockdown — students from the UTS Animal Logic Academy kicked off the Data Arena's first series of live Virtual Production experiments. Virtual production combines live-action video with large-screen computer graphics that respond to the camera's moves in realtime. The position and motion of a live video camera are tracked in 3D. Large-screen computer graphics provide a digital background set behind a live actor, and this background is transformed to counter-match each camera move. Done correctly, the actor appears to be standing inside the virtual world: the computer graphics are locked to the video camera's moves. This live, in-camera technique replaces the traditional green-screen (chroma-key) background-replacement and rotoscope techniques, typically produced offline in a complex post-production process. The benefits are immediate: the director can see the result on set, in camera; the display can even light the actor; and there is no green-screen light spill on the actors.
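The core loop behind that counter-matching can be sketched in a few lines: every frame, the tracked pose of the real camera is copied onto the virtual camera rendering the background, so the graphics stay locked to the camera's moves. The sketch below is a minimal illustration in plain Python, with a simulated tracking feed standing in for a real motion-capture stream; the `Pose` class, `simulated_mocap_frame`, and `sync_virtual_camera` names are hypothetical, not part of any Unreal Engine or mocap API.

```python
# Minimal sketch of the per-frame camera-sync loop in virtual production.
# A real setup would stream poses from a tracking system into the engine;
# here the mocap feed is simulated so the example is self-contained.

from dataclasses import dataclass


@dataclass
class Pose:
    """Position (metres) and yaw/pitch/roll (degrees) of a camera."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    yaw: float = 0.0
    pitch: float = 0.0
    roll: float = 0.0


def simulated_mocap_frame(t: float) -> Pose:
    """Stand-in for one frame of tracking data (hypothetical feed):
    the real camera slowly dollies right while panning."""
    return Pose(x=0.1 * t, yaw=2.0 * t)


def sync_virtual_camera(virtual_cam: Pose, tracked: Pose) -> None:
    """Lock the virtual camera to the real camera's tracked pose,
    so the rendered background counter-matches the camera move."""
    virtual_cam.x, virtual_cam.y, virtual_cam.z = tracked.x, tracked.y, tracked.z
    virtual_cam.yaw = tracked.yaw
    virtual_cam.pitch = tracked.pitch
    virtual_cam.roll = tracked.roll


virtual_cam = Pose()
for frame in range(5):                              # five frames of a 'take'
    tracked = simulated_mocap_frame(frame / 24.0)   # 24 fps timebase
    sync_virtual_camera(virtual_cam, tracked)       # background now matches
```

In a production pipeline the same idea runs at the display's refresh rate, with the engine re-rendering the background from the virtual camera's updated viewpoint each frame.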
Virtual Production is a rapidly advancing field, pioneered at scale by the production of The Mandalorian.
While not quite at the production scale of Industrial Light & Magic (ILM), the ALA students quickly developed scenes in Unreal Engine. They worked with us to feed the position and rotation of their real camera in the Data Arena, via our motion-capture system, to a virtual camera in the digital scene. These experiments are simple prototypes to gauge how viable this sort of production environment might be for future projects. We were excited to have our first Virtual Production tests up and running within an hour.