Year | 2020 |
Credits | Thomas Ricciardiello, DA Software Developer (Project Development); Ben Simons, Technical Director (Technical Lead); Darren Lee, DA Software Developer (Technical Assistance) |
3D Stereo | Yes |
Tags | QR code architecture archivis interior-design multi-user realtime unreal-engine |
The Cosy Living Room project is our Unreal Engine 4 debut after successfully modifying the engine to run on our Linux cluster with the nDisplay module. This project was initially built to test a Rhino → Unreal Engine pipeline, and since has become fully integrated into the Data Arena with our external peripheries and input devices. Up to 8 users can each have a motion-capture controlled cursor in the room and collaborate on new interior layouts by moving lighting and furniture.
The Living Room model was originally sourced from CG Trader (since made unavailable) in the .3DM
format for modelling software Rhino, commonly used amongst architects and interior designers. We wanted to explore a pipeline from Rhino to Unreal Engine 4 to make the process of model visualisation in the Data Arena easier for our students and staff. Thankfully this was a simple FBX
export from Rhino and a basic import into the Unreal Engine project. While nearly everything was in place to begin with, some materials didn't quite translate accurately (e.g. window and object glass, fabrics and wood) and needed some tweaking afterwards.
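For anyone who wants to script that step rather than use the editor dialog, the FBX import can also be automated from editor C++. This is a minimal sketch only, with hypothetical file paths and option values, not the exact process we used:

```cpp
// Editor-only sketch: automate an FBX import with UAssetImportTask.
// The file path and destination are hypothetical examples.
#include "AssetImportTask.h"
#include "AssetToolsModule.h"
#include "Factories/FbxImportUI.h"

void ImportLivingRoomFbx()
{
    UAssetImportTask* Task = NewObject<UAssetImportTask>();
    Task->Filename        = TEXT("D:/Exports/CosyLivingRoom.fbx"); // hypothetical path
    Task->DestinationPath = TEXT("/Game/LivingRoom");
    Task->bAutomated      = true;   // suppress the interactive import dialog
    Task->bSave           = true;

    UFbxImportUI* Options = NewObject<UFbxImportUI>();
    Options->bImportMaterials = true;   // imported materials may still need manual tweaks
    Options->bImportTextures  = true;
    Task->Options = Options;

    FAssetToolsModule& AssetTools =
        FModuleManager::LoadModuleChecked<FAssetToolsModule>("AssetTools");
    AssetTools.Get().ImportAssetTasks({ Task });
}
```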
Best Demo Award:
ACM SUI '23
We built a small version of the Cosy Living Room to show at the international conference: ACM Spatial User Interaction (SUI) 2023. It ran on 1 server with 2 large LG TV screens (which display passive 3D Stereo).
Published in the ACM '23 proceedings, Cosy won the Best Demo Award! 🎉
A few WIP shots during our lighting development. You can find more detail about the process here.
As a (primarily) game engine, UE makes real-time interaction and manipulation of a level fairly straightforward, and the nDisplay module allows us to tap into our VRPN network for access to any number of input devices. We built an interaction system based around our Optitrack motion capture "tracking markers".
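The plugin consumes tracker data through nDisplay's input bindings, but underneath it is simply a VRPN client subscribing to the motion capture server. A minimal standalone sketch using the standard vrpn client library follows; the device name and host are assumptions, not our actual configuration:

```cpp
// Minimal VRPN client sketch: receive pose updates for tracked markers.
// "Tracker0@mocap-server" is a placeholder device name.
#include <cstdio>
#include <vrpn_Tracker.h>

void VRPN_CALLBACK HandleTracker(void* /*userData*/, const vrpn_TRACKERCB t)
{
    // t.sensor is the marker number; pos is in metres, quat is (x, y, z, w).
    std::printf("marker %d  pos(%.2f, %.2f, %.2f)\n",
                t.sensor, t.pos[0], t.pos[1], t.pos[2]);
}

int main()
{
    vrpn_Tracker_Remote tracker("Tracker0@mocap-server");
    tracker.register_change_handler(nullptr, HandleTracker);
    while (true)
        tracker.mainloop();   // pump the VRPN connection; invokes the callback
}
```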
A tracking marker is a piece of plastic with 3 balls which glow in the dark (well, in infra-red). They're visible to our motion capture system. Each marker has a unique 3-ball arrangement, and is labelled with a marker number from 1 to 8, so the cameras can tell them apart. Further, each set of 3 balls creates a virtual triangle. The orientation of each unique triangle - the way it is held by a user - is meaningful. We infer gestures from it. Users move the markers in the Theatre, and can see a corresponding cursor on screen. The number on the cursor matches the number on the marker. That's your mouse.
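As a rough illustration of that gesture inference: each rigid body arrives as a position plus an orientation quaternion, and a "flip" can be detected by rotating the marker's local up axis into world space and checking whether it now points downwards. This is a simplified sketch with an assumed threshold and axis convention, not the plugin's actual gesture code:

```cpp
// Sketch: detect a "flipped" tracking marker from its orientation quaternion.
// The tilt threshold and the choice of local up axis are assumptions.
#include "Math/Quat.h"
#include "Math/Vector.h"

bool IsMarkerFlipped(const FQuat& MarkerOrientation)
{
    // Rotate the marker's local up axis into world space.
    const FVector MarkerUp = MarkerOrientation.RotateVector(FVector::UpVector);

    // Dot with world up: +1 means upright, -1 means fully inverted.
    const float Upright = FVector::DotProduct(MarkerUp, FVector::UpVector);

    // Treat anything tilted past roughly 120 degrees as a deliberate "grab" flip.
    return Upright < -0.5f;
}
```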
In the early version of the Living Room project, up to 8 users could have a cursor on screen. This number was increased later by the use of smart-phones (see below). The first version used the Optitrack motion capture tracking system. An interesting aspect was that it required users to walk around the theatre: to move furniture, you first needed to walk over to where it was on screen to select it. Each tracking marker's screen position was based on where the user stood in the Data Arena. Moving furniture or lights is a simple gesture: point at an object on screen, see the object outlined in white, then flip the tracker to grab it -- but you would need to go and select it first.
To move furniture to another space, you'd have to physically walk over to where it is on screen, pick it up (with a gesture) and walk it to its new place. The Data Arena is 10 metres in diameter. Sometimes the object you wanted to select was on the other side of the room. Furniture can be placed anywhere in the room at a new angle. You can read more about our process of developing a multi-cursor system here.
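Under the hood, that point-and-highlight step maps naturally onto a line trace plus Unreal's custom-depth outline. A hedged sketch of how it could be wired up (the trace channel, reach distance and the post-process outline material are assumptions):

```cpp
// Sketch: trace along a cursor's pointing ray and outline whatever it hits.
// Assumes a post-process material that draws custom-depth objects with a white edge.
#include "Engine/World.h"
#include "GameFramework/Actor.h"
#include "Components/PrimitiveComponent.h"

AActor* HighlightPointedActor(UWorld* World, const FVector& CursorOrigin,
                              const FVector& PointingDirection)
{
    const FVector TraceEnd = CursorOrigin + PointingDirection * 5000.0f; // ~50 m reach

    FHitResult Hit;
    FCollisionQueryParams Params(FName(TEXT("CursorTrace")), /*bTraceComplex=*/false);

    if (World->LineTraceSingleByChannel(Hit, CursorOrigin, TraceEnd,
                                        ECC_Visibility, Params))
    {
        if (UPrimitiveComponent* Comp = Hit.GetComponent())
        {
            Comp->SetRenderCustomDepth(true);   // picked up by the outline material
        }
        return Hit.GetActor();
    }
    return nullptr;
}
```

Clearing the highlight when the cursor moves off an object, and the grab/move/drop state machine, sit on top of a step like this.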
Keep in mind this is a 3D stereoscopic display. The objects appear to float in 3D space. The effect is quite immersive. Come see it.
Our SpaceNavigator controls the in-game camera to allow for smooth movement across six degrees of freedom. This operates independently of the tracker-based cursors, so a ninth person could control the view of the Living Room while others work on a new arrangement. If at any time the layout becomes unworkable, a button press on the SpaceNavigator returns the furniture to its default layout.
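Applying the SpaceNavigator's six axes to the in-game camera is conceptually a per-tick local-space offset and rotation. A simplified sketch, assuming the device's translation and rotation values have already arrived as analog axis inputs (the axis mapping and scale factors are made up for illustration):

```cpp
// Sketch: drive a camera pawn from six analog axes (e.g. a SpaceNavigator over VRPN).
// Axis values are assumed to be roughly normalised to [-1, 1]; speeds are arbitrary.
#include "GameFramework/Pawn.h"

void ApplySixDofInput(APawn* CameraPawn, const FVector& Translate,
                      const FVector& Rotate, float DeltaSeconds)
{
    const float MoveSpeed = 300.0f;  // cm per second at full deflection
    const float TurnSpeed = 45.0f;   // degrees per second at full deflection

    // Translate and rotate in the pawn's local frame for fly-through movement.
    CameraPawn->AddActorLocalOffset(Translate * MoveSpeed * DeltaSeconds);
    CameraPawn->AddActorLocalRotation(
        FRotator(Rotate.Y, Rotate.Z, Rotate.X) * TurnSpeed * DeltaSeconds);
}
```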
In late 2021, we developed DA Live: a new interaction system which provides each user a mouse, using their phone as a trackpad. Many cursors appear on screen, labelled with an anonymous user-number. This can be changed to your name and preferred colour.
Users interact with objects in the scene - at the same time. Click on menus, hover on data points, explore, reveal & download data.
To get started users scan a QR code with their phone. This enables a trackpad on the phone & pops a cursor on the Data Arena Screen. The QR code changes every session, to restrict access.
With the Cosy Living Room, every user can move furniture around the room. Users hover to select an object, tap to pick it up, then lift, move, rotate and drop objects to work collaboratively on furniture arrangements.
DA Live leads to Group Tasks - for example, visiting students are asked: Please move all the plants into the sunlight. Some furniture needs to be moved out of the way. It is up to the group to figure out how to achieve the mission. It's possible for several people to co-operate on a new position for the lounge. Together they select, lift, move and turn it to find its new place.
The interaction interface is hosted on an external web server, which provides simple, instant access. There's no need to authenticate with the local network, no username & password, and no app to download. The Dynamic QR Code ensures a unique connection each session.
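We haven't detailed the DA Live web stack here, but the dynamic QR code idea reduces to generating a fresh, unguessable session token and encoding it into the join URL that the QR code displays. A hedged sketch of that piece only; the host name, query parameter and token scheme are assumptions, not the actual DA Live implementation:

```cpp
// Sketch: build a one-off join URL for a session; the QR code simply encodes this string.
// The server address and parameter name are placeholders.
#include "Misc/Guid.h"
#include "Containers/UnrealString.h"

FString MakeSessionJoinUrl()
{
    // A fresh GUID per session means an old QR code no longer grants access.
    const FString SessionToken = FGuid::NewGuid().ToString(EGuidFormats::Digits);
    return FString::Printf(TEXT("https://example-dalive-server/join?session=%s"),
                           *SessionToken);
}
```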
This project is a milestone for multi-user experience in the Data Arena and represents our intentions to move into spaces of architectural visualisation, interior design and real-time "Digital Twin" model production with Unreal Engine.
It's a tech demo - here's how 8 (or more) people can interact & collaborate with objects in a shared Virtual Reality. You can use this for your project. The 8-mouse (mocap) input system is part of the Data Arena's Unreal Engine Plugins. The QR-Code Smartphone Mode enables more than 8 users (essentially everyone in the theatre).
Download the plugins for your Unreal Engine Editor. Let's create collaborative visualisation projects. This demo was built on a MacBook Pro laptop.
Unreal Engine 4.25.4 is up and running in the Data Arena, and we invite you to share your Unreal Engine projects with us and try them out. You can follow our ongoing development with Unreal Engine and download materials to help get your project started here.
Since 2023, the Data Arena has run Unreal Engine 5.