We are currently working towards a pipeline that lets users bring their motion-capture (mocap) data into Unreal Engine and display it in the UTS Data Arena. Our current setup lets users specify how many markers they have; once the data is formatted correctly, it can be imported into Unreal and our blueprint will display it accordingly.
For this demo we used four markers (two for the hands and two for the feet) and recorded the mocap into a .csv file. The results can be seen below:
Our data was split into four sets of XYZ coordinates, with additional columns for the frame number and elapsed time, which we then import into a prepared data struct in Unreal. Setting up your data struct correctly is one of the most important steps in this process. You can see our setup and what the input looks like below.
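To make the expected layout concrete, here is a small sketch of a CSV in that shape: a frame index, elapsed time, and an XYZ triple per marker. The column names below are illustrative assumptions, not our exact struct fields; whatever names you choose must match the fields of your Unreal data struct.

```python
import csv
import io

# Hypothetical column layout for a four-marker take (names are
# assumptions; they must mirror the Unreal data struct's fields).
HEADER = ["Frame", "Time"] + [
    f"{marker}_{axis}"
    for marker in ("LeftHand", "RightHand", "LeftFoot", "RightFoot")
    for axis in ("X", "Y", "Z")
]

sample = io.StringIO(
    "Frame,Time,LeftHand_X,LeftHand_Y,LeftHand_Z,"
    "RightHand_X,RightHand_Y,RightHand_Z,"
    "LeftFoot_X,LeftFoot_Y,LeftFoot_Z,"
    "RightFoot_X,RightFoot_Y,RightFoot_Z\n"
    "0,0.000,1.0,2.0,3.0,4.0,5.0,6.0,7.0,8.0,9.0,10.0,11.0,12.0\n"
)

reader = csv.DictReader(sample)
rows = list(reader)
# Each row now maps struct-field names to values, mirroring the
# DataTable row that the level blueprint reads from.
```

A quick sanity check like this before importing can save a lot of debugging inside Unreal, since a DataTable import will silently mismatch if the header names and struct fields differ.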
We set our table in the level blueprint, attached to Event BeginPlay. We then set the length to the total number of rows in this input, eliminating the need for any manual entry on this front.
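Outside Unreal, the same idea — deriving the row count from the data rather than typing it in — can be sketched like this (the function name is ours, not a blueprint node):

```python
import csv
import io

def total_rows(csv_text: str) -> int:
    """Count data rows (excluding the header), mirroring how the
    level blueprint derives its row total from the table at BeginPlay."""
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)  # skip the header row
    return sum(1 for _ in reader)

sample = "Frame,Time,X,Y,Z\n0,0.00,1,2,3\n1,0.01,4,5,6\n"
count = total_rows(sample)  # 2 data rows
```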
The image below is INTERACTIVE! It's the level blueprint for this mocap visualisation.
Click the Fullscreen button for a better view:
Controls for the blueprint (above):

| Control | Action |
| --- | --- |
| Left Mouse Button | Select/Drag |
| Right Mouse Button | Pan |
Our level blueprint works by first having the user set the data struct they want to use for their visualisation. We then find the total number of rows and loop through them, reading the number of XYZ coordinates given by our marker variable (set by the user based on the data they recorded). A SpawnActor node brings in our sphere object, with trails, for the visualisation. We create an array to store the incoming data row by row; each row is added to our CurrentList variable and set as the X, Y and Z coordinates.
Moving on to the Event Tick function, we use our TotalRows variable to step through the data and move each MocapActor (the spheres) to its coordinates.
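The per-tick stepping can be sketched like this. It is a simplified model, assuming one row is consumed per tick and that playback loops once TotalRows is exhausted (an assumption on our part; the class and method names are illustrative, not the blueprint's):

```python
class MocapPlayback:
    """Minimal sketch of the Event Tick logic: advance a row index
    each tick and return the position each marker actor should move to."""

    def __init__(self, tracks):
        self.tracks = tracks               # per-marker list of (x, y, z)
        self.total_rows = len(tracks[0])   # mirrors the TotalRows variable
        self.current_row = 0

    def tick(self):
        """Return each marker's coordinates for this frame, wrapping
        back to the start after the last row."""
        positions = [track[self.current_row] for track in self.tracks]
        self.current_row = (self.current_row + 1) % self.total_rows
        return positions

# One marker with two frames of data:
tracks = [[(0, 0, 0), (1, 1, 1)]]
playback = MocapPlayback(tracks)
```

In the actual blueprint this logic runs on Event Tick, so playback speed depends on frame rate; scaling the step by delta time would decouple the two.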
We are currently working with a data set recorded from a drum kit, capturing the movements a drummer makes while playing. Our previous work on trails and the blueprint setup gives us a pathway to import this data.
The following video is a work in progress and will be updated as our visualisation develops. Please check back to see the improvements we make over time! This latest update is from Friday 10th July and shows our new materials and lighting, which better demonstrate the movement of the data.
And a second take from the same project:
This data was recorded in a live drum session with mocap markers; you can see some information about the process here:
Embedded Instagram post: behind the scenes of the making of the 'Nero' music video for Laurence Pike by Clemens Habicht, with thanks to Thomas Ricciardiello and the UTS Data Arena, and to Collider for pulling it off remotely during lockdown.