29 Oct 2020
Live interaction with our Motion Capture trackers
Unreal Engine 4 is running in the UTS Data Arena. Throughout 2020 we have developed a custom version of the engine to run on our Linux cluster in stereoscopic 3D using the Unreal Engine nDisplay module. We now have a stable, thoroughly tested implementation of the engine that supports our input and navigation devices, including our 360º motion capture system. If you have a project running in Unreal Engine 4, or want to create one for the Data Arena, get started here.
Useful assets to help get you started or prepare an existing project.
Setting up projects for the Data Arena and nDisplay in general is simple, but our Hello World project will give you a head start. This project is already configured with nDisplay, our latest configuration file, and a few of the assets from our UTS Data Arena plugin. You can fly through this scene with the Space Navigator and use up to 8 Optitrack "trackers" to move objects around in real time. If you're near UTS and want to see this project in person, just let us know. If you already have a project running, download the Data Arena plugin to take full advantage of our DA hardware.
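For a sense of what an nDisplay setup involves, here is a cut-down sketch of a UE4 nDisplay configuration file for a single cluster node driving one flat screen. All ids, addresses, and dimensions below are placeholders for illustration only; the Data Arena's actual configuration file defines its full cluster of render nodes and the cylindrical screen geometry.

```
# Illustrative UE4 nDisplay .cfg fragment — placeholder values throughout
[cluster_node] id=node_0 addr=192.168.0.10 window=wnd_0 master=true
[window]       id=wnd_0 viewports=vp_0 fullscreen=true
[viewport]     id=vp_0 x=0 y=0 width=1920 height=1080 projection=proj_0
[projection]   id=proj_0 type=simple screen=scr_0
[screen]       id=scr_0 loc="X=1.5,Y=0,Z=0" rot="P=0,Y=0,R=0" size="X=3,Y=2"
[camera]       id=camera_static loc="X=0,Y=0,Z=1.7"
```

Each `[screen]` entry describes a physical display surface in metres relative to the tracking origin, which is how nDisplay computes the correct off-axis stereo frustum for every viewport in the cluster.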
To help you set up an existing project for the Data Arena, we've created a plugin of helpful assets. These include Blueprint Classes to access live data from our 360º Motive Optitrack motion capture system and to interface with our Space Navigator joystick. The included TrackerCursor class lets you easily add a motion capture controller "tracker" cursor on screen, with functions for dynamic object selection, translation, and rotation, and the SpaceNavCamera class is a drag-and-drop camera system that uses all six degrees of freedom of the Space Navigator device.
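To illustrate the idea behind a six-degree-of-freedom fly-through camera like SpaceNavCamera, here is a minimal, engine-free C++ sketch. It is an assumption-laden illustration, not the plugin's actual API: the struct names, axis mapping, and tuning constants are all invented for this example, and the real Blueprint class handles input through Unreal's systems rather than raw values.

```cpp
// Minimal 6DOF camera integration sketch (illustrative only — names and
// axis conventions are assumptions, not the UTS Data Arena plugin's API).
#include <cmath>

// One frame of Space Navigator input: three translation axes and three
// rotation axes, each assumed normalised to [-1, 1].
struct SpaceNavInput {
    double tx, ty, tz;   // left/right, up/down, forward/back
    double rx, ry, rz;   // pitch, yaw, roll rates
};

struct CameraPose {
    double x = 0, y = 0, z = 0;           // position in world units
    double pitch = 0, yaw = 0, roll = 0;  // orientation in radians
};

// Integrate one frame of device input over dt seconds. Translation is
// applied in the camera's local frame (yaw only, for simplicity), so
// pushing "forward" moves along the current heading.
void UpdateCamera(CameraPose& cam, const SpaceNavInput& in, double dt,
                  double moveSpeed = 2.0, double turnSpeed = 1.0) {
    cam.pitch += in.rx * turnSpeed * dt;
    cam.yaw   += in.ry * turnSpeed * dt;
    cam.roll  += in.rz * turnSpeed * dt;

    // Rotate the local (tx, tz) translation into world space by the yaw.
    const double c = std::cos(cam.yaw), s = std::sin(cam.yaw);
    cam.x += (in.tx * c + in.tz * s) * moveSpeed * dt;
    cam.z += (in.tz * c - in.tx * s) * moveSpeed * dt;
    cam.y += in.ty * moveSpeed * dt;  // vertical stays world-aligned
}
```

Per-frame integration like this is why the device feels like a velocity controller: holding the puck forward produces continuous motion, and releasing it (all axes zero) stops the camera immediately.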