20 Apr 2022 14:04

Virtual Production Experiments in Exploring Mawson

After a successful shoot on the Exploring Mawson set in March 2022, we worked with UTS MAP lecturer Greg Ferris to create an alternative Virtual Production-style camera system using the Data Arena's Motion Capture setup. As explained in detail on this page, we applied the same process from an earlier experiment with the UTS Animal Logic Academy to the virtual camera inside the Exploring Mawson Unreal Engine project.

One of our MoCap trackers was attached to a Sony FX6, and the real-world movements of this rig drove the movements of the virtual camera, producing realistic object parallax and actor placement in the virtual world. In the videos below, you can see the output from the Sony FX6, as well as a contextual iPhone recording from Thomas.
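For readers curious how a rig like this can be wired up, here is a minimal, hypothetical sketch of an Unreal Engine C++ actor that applies a MoCap tracker pose to a CineCamera every frame. This is not the code used in the Exploring Mawson project: the tracker source (GetTrackerTransform), the TranslationScale parameter and the StageOrigin offset are all assumptions for illustration, standing in for however the tracking data actually reaches the engine (for example via Live Link).

```cpp
// VirtualCameraDriver.h -- hypothetical example, not the Exploring Mawson implementation.
// Assumes a MoCap tracker pose is already available as an FTransform each frame.

#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "CineCameraActor.h"
#include "VirtualCameraDriver.generated.h"

UCLASS()
class AVirtualCameraDriver : public AActor
{
    GENERATED_BODY()

public:
    AVirtualCameraDriver() { PrimaryActorTick.bCanEverTick = true; }

    // The CineCamera in the level that should follow the physical FX6 rig.
    UPROPERTY(EditAnywhere, Category = "Virtual Camera")
    ACineCameraActor* TargetCamera = nullptr;

    // Scale applied to real-world translation, for tuning the real-to-virtual mapping.
    UPROPERTY(EditAnywhere, Category = "Virtual Camera")
    float TranslationScale = 1.0f;

    // Fixed offset placing the MoCap volume's origin somewhere sensible in the virtual set.
    UPROPERTY(EditAnywhere, Category = "Virtual Camera")
    FTransform StageOrigin = FTransform::Identity;

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (!TargetCamera)
        {
            return;
        }

        // Scale the tracked translation, then place it relative to the stage origin.
        FTransform TrackerPose = GetTrackerTransform();
        TrackerPose.SetLocation(TrackerPose.GetLocation() * TranslationScale);
        TargetCamera->SetActorTransform(TrackerPose * StageOrigin);
    }

private:
    // Placeholder: replace with a Live Link evaluation or your tracking plugin's API.
    FTransform GetTrackerTransform() const { return FTransform::Identity; }
};
```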

The results from this very quick test (roughly 30 minutes of setup time) are very promising. Many refinements could still be made, both to the mapping between real-world and virtual-world movement and to camera settings such as field of view and motion blur.
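As one example of those camera adjustments, a CineCamera's lens and motion blur can be tuned directly on its component. The focal length, filmback and blur values below are illustrative guesses, not the settings used on the day, and the fragment assumes the TargetCamera pointer from the sketch above.

```cpp
// Illustrative only: rough lens and motion-blur settings for the virtual camera.
UCineCameraComponent* Cine = TargetCamera->GetCineCameraComponent();
Cine->SetCurrentFocalLength(35.0f);           // example focal length in mm
Cine->Filmback.SensorWidth = 35.6f;           // approximate full-frame width, close to the FX6's sensor
Cine->Filmback.SensorHeight = 18.8f;

Cine->PostProcessSettings.bOverride_MotionBlurAmount = true;
Cine->PostProcessSettings.MotionBlurAmount = 0.5f;  // tune alongside shutter angle and frame rate
```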