BUILDING OUR VIRTUAL REALITY DRIVING SIMULATOR GATHERS PACE

As the team began their phased return to the office, they discovered a new, brighter, larger one, which has become entirely necessary. Interest in our work creating immersive experiences is attracting many new admirers, all of whom want innovative virtual reality or augmented reality (or a combination of the two) solutions to some old training challenges. This was our week.

Week commencing 13th July

Toby – Managing Director

With a few of our solutions nearing release, it was pretty busy around W4, and obviously we added to the to-do list by moving into our new offices as well. We’re still in The Courtyard but need more space, so we’ve moved into a bigger room.

Hopefully that’ll mean we can have it all going on: motion capture, the driving simulator rig, and StarTracker – more of that here. Since I’m now pretty much a professional when it comes to sticking StarTracker mirrors to ceilings, I carried on at the office, and now our new space looks suitably spotty when you glance skywards.

These are really exciting times for MXTreality and I look forward to publishing more details in the coming weeks.

Josh – Programme Manager

On holiday this week, so no diary. But I’m still aware I’m getting increasingly busy as I continue to lead the team on all the projects. So many projects.

Cat Flynn – Lead Programmer

I started this week by working on a VR orientation experience with Kyung-Min. Taking inspiration from Richie’s Plank Experience, we want to induce vertigo (or more accurately acrophobia) in our players (mean, I know) to show them some of what VR is capable of.

Working with the beautiful environment Kyung-Min has made, we’re using this as an opportunity to learn about Unity’s Universal Render Pipeline. My job is to implement a trigger system that allows the world to react to the players’ movements, as well as handle players falling to their demise and respawning.

Unfortunately, early in the week I suffered a hardware failure and had to go to the office to replace my headset. Notwithstanding the cautious commute to Chiswick, it was great to see Stefano, Jordi and Toby in person again, and since we’ve recently moved to another building, seeing the new office was quite exciting.

I’m looking forward to moving back in and working there full time after so long at home! While there, I got to see the physical driving simulator setup with a steering wheel and pedals – now in Jordi’s capable hands. It’s coming together, a little less slapdash than the setup I had at home.

The rest of my week has been spent developing the user interface for our ‘Animals on the Network’ solution. Initially we wanted to try reusing the interface from Traffic Officer VR, but it proved not quite fit for purpose – as lovely as it is, this project has different UI requirements.

So, I split up our old interface and pulled out the buttons, which I then used to build a simpler interface for displaying the animal behaviours and playing their relevant animations. Hopefully by splitting the interface into smaller components we’ll be able to re-use them frequently in other projects, while maintaining the distinctive interaction and visual style we’ve achieved so far.

Sergio – Programmer

My focus for last week was to finish the animation root motion controller for our ‘Animals on the Network’ project. To be precise, a horse transition animation in which the horse changes from its idle animation to its run animation while following a path along the road.

The difference between root motion and normal animation is simply that with root motion the animation itself controls the position of the animated object in the world, whereas with typical animations the movement is controlled by code.

The main advantages are that the object follows the exact movement curve and looks more realistic, with minimal ground sliding. The implementation consisted of manually scripting the logic with data gathered directly from the animation and then, based on a Bézier curve path, applying a rotation to align the animal’s forward direction.
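
To illustrate the idea (this is a sketch, not the project’s code; the control points and the +z-forward convention are assumptions), a point and a tangent on a cubic Bézier path can be evaluated like this, with the tangent converted into the yaw the animal should face:

```cpp
// Illustrative sketch: sampling a cubic Bezier path and deriving a yaw angle
// so a character's forward direction follows the curve.
#include <cmath>
#include <cstdio>

struct Vec2 { float x, z; };   // ground-plane position (x, z)

// Point on a cubic Bezier defined by control points p0..p3, for t in [0, 1].
Vec2 bezierPoint(Vec2 p0, Vec2 p1, Vec2 p2, Vec2 p3, float t) {
    float u = 1.0f - t;
    return { u*u*u*p0.x + 3*u*u*t*p1.x + 3*u*t*t*p2.x + t*t*t*p3.x,
             u*u*u*p0.z + 3*u*u*t*p1.z + 3*u*t*t*p2.z + t*t*t*p3.z };
}

// Tangent (derivative) of the curve; its direction is the desired facing.
Vec2 bezierTangent(Vec2 p0, Vec2 p1, Vec2 p2, Vec2 p3, float t) {
    float u = 1.0f - t;
    return { 3*u*u*(p1.x - p0.x) + 6*u*t*(p2.x - p1.x) + 3*t*t*(p3.x - p2.x),
             3*u*u*(p1.z - p0.z) + 6*u*t*(p2.z - p1.z) + 3*t*t*(p3.z - p2.z) };
}

int main() {
    Vec2 p0{0, 0}, p1{2, 0}, p2{4, 2}, p3{6, 2};   // made-up path across the road
    for (float t = 0.0f; t <= 1.0f; t += 0.25f) {
        Vec2 pos = bezierPoint(p0, p1, p2, p3, t);
        Vec2 tan = bezierTangent(p0, p1, p2, p3, t);
        float yawDeg = std::atan2(tan.x, tan.z) * 180.0f / 3.14159265f;
        std::printf("t=%.2f pos=(%.2f, %.2f) yaw=%.1f deg\n", t, pos.x, pos.z, yawDeg);
    }
}
```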

Stefano – 3D Artist

This week I joined Sergio in studying A-Frame, a web framework for building VR/AR experiences that work in the browser without the need to install anything – which is key to many of our future projects in discussion with clients.

I had to face all the limitations imposed by this method and learn the right way to export the assets from the 3D package to make them work effectively.

One of the issues was limiting the maximum influence of the skeleton on the mesh: each vertex can be affected by a maximum of four different bones. Maya by default starts with a max influence of five, so at first the animation wasn’t working; I then found solutions to fix it before exporting.
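
As a rough sketch of what that limit amounts to (the fix itself was made in Maya before export; the data layout here is an assumption), an exporter effectively keeps only the four strongest bone weights per vertex and renormalises them:

```cpp
// Illustrative sketch: clamp skin weights to the 4 strongest influences per
// vertex and renormalise, mirroring the 4-bone limit required for glTF/web export.
#include <algorithm>
#include <cstddef>
#include <vector>

struct Influence { int bone; float weight; };

std::vector<Influence> limitInfluences(std::vector<Influence> weights,
                                       std::size_t maxInfluences = 4) {
    // Sort strongest first and drop everything past the limit.
    std::sort(weights.begin(), weights.end(),
              [](const Influence& a, const Influence& b) { return a.weight > b.weight; });
    if (weights.size() > maxInfluences)
        weights.resize(maxInfluences);

    // Renormalise so the remaining weights still sum to 1.
    float total = 0.0f;
    for (const auto& w : weights) total += w.weight;
    if (total > 0.0f)
        for (auto& w : weights) w.weight /= total;
    return weights;
}
```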

As a next step I would like to concentrate on material and texture usage and optimisation. The glTF Validator (https://github.khronos.org/glTF-Validator/) also helped a lot in finding issues in the models I was testing, offering hints about the problems.

Near the end of this week I also started testing facial expression motion capture using an iPhone X. For now I have just scratched the surface, but I’m sure there are a lot of nice things to discover that will undoubtedly add more realism to the immersive experiences we create.

Slava – Lead 3D Artist

Last week I started a big task: a 3D model of a motorbike for Unity. This week I completed two versions of a sport motorbike, for use in traffic and in accident scenarios. The second version has some signs of a crash, such as a broken mirror.

I made a unique, non-overlapping UV mapping and a unique texture for this model, so that we can make custom variations of this motorbike for future uses. These variations may include various decals or weathering. I used Substance Painter to add realistic ageing to the current model.

During this week I also created animated hazard lights and exhaust fumes for one of our cars. To create exhaust fumes in Unity I needed to use the particle system, which I had experience of using in other engines, such as Unreal Engine and Big World Engine, but this was my first time using it in Unity.

I was happy to find that particles in Unity have all the important features I needed, available through a good practical interface, which helped me deliver what I think are very good results. What do you think?

Kyung-Min – 3D Generalist

This diary marks the shortest week I have ever spent at MXTreality. Having worked remotely since lockdown began, with only Stefano and Toby holding the fort, this was my first time back in the office, and I have to say it felt great to be back.

As is customary at MXTreality, there is never a quiet moment, and we find ourselves moving to a bigger office just across the courtyard! During our stand-up meeting it was announced that painting of our new studio would commence, and I immediately offered to help.

Having been away from the office for so long, spending time with some of the team and seeing the new office (and new art, thanks to Toby) felt great – we have a brilliant team here.

During this pandemic, I think we have all changed while learning new things and something I have learned is the importance of helping educate others, while considering the future we leave behind.

There are many lessons to be taught, and I believe that, through the mediums of VR/AR at MXTreality, we shall be teaching them – not just for a better, safer working environment or quality of life, but for a better future too.

Expect great things from us and you will still be surprised.

Jordi Caballol – Programmer

The last few weeks have been devoted to the driving simulator. Most of the work has been on two fronts: the actual simulation of the car and the force feedback through the steering wheel.

For the simulation of the car, it’s simply a lot of work, as cars have a lot of internal systems (the engine, the transmission, the clutch…) that need to be simulated. Most of these simulations don’t require very complex code, but the values they use need to be accurate, and fine-tuning these values is where most of the time is spent – but also where the realism comes from.
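
As a very simplified sketch of the kind of per-step calculation involved (every curve and constant below is a placeholder, not the simulator’s actual tuning), engine torque can be looked up from an RPM curve, scaled by throttle, passed through the clutch and gear ratios, and turned into a drive force at the wheels:

```cpp
// Very simplified drivetrain step: engine torque -> clutch -> gearbox -> wheel force.
// All curves and constants are placeholder values, not the simulator's tuning.
#include <algorithm>
#include <cstdio>

float engineTorque(float rpm) {
    // Crude torque curve: peaks mid-range, falls off towards idle and the limiter.
    if (rpm < 1000.0f || rpm > 7000.0f) return 0.0f;
    float t = (rpm - 1000.0f) / 6000.0f;                        // 0..1 across the rev range
    return 250.0f * (1.0f - (t - 0.5f) * (t - 0.5f) * 2.0f);    // N*m
}

float wheelDriveForce(float rpm, float throttle, float clutch,
                      float gearRatio, float finalDrive, float wheelRadius) {
    float torque = engineTorque(rpm) * std::clamp(throttle, 0.0f, 1.0f);
    torque *= std::clamp(clutch, 0.0f, 1.0f);   // 0 = disengaged, 1 = fully engaged
    torque *= gearRatio * finalDrive;           // multiplied through the transmission
    return torque / wheelRadius;                // force at the contact patch, in N
}

int main() {
    float force = wheelDriveForce(3500.0f, 0.8f, 1.0f, 3.2f, 3.9f, 0.31f);
    std::printf("drive force: %.0f N\n", force);
}
```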

For the force feedback, the main source of work is that it isn’t natively supported by the Unity engine, so to use it I needed to build a native C++ plugin that exposes this functionality, with an interface that allows us to use it easily from the simulator’s code.
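
As a hedged sketch of what such a plugin boundary can look like (the function names and the normalised-force interface are assumptions, not the actual plugin), a native plugin typically exposes a small extern "C" surface that the engine-side code calls, with the device-specific work hidden behind it:

```cpp
// Illustrative sketch of a native plugin surface for steering-wheel force feedback.
// Function names are hypothetical; the device-specific calls (e.g. via the wheel
// vendor's SDK or DirectInput) would live behind these exports.
#include <algorithm>

#if defined(_WIN32)
#define PLUGIN_API extern "C" __declspec(dllexport)
#else
#define PLUGIN_API extern "C"
#endif

static float g_currentForce = 0.0f;   // last force sent, in [-1, 1]

PLUGIN_API bool FFB_Initialise() {
    // Acquire the steering-wheel device here (device SDK call omitted).
    return true;
}

PLUGIN_API void FFB_SetConstantForce(float force) {
    // Clamp to the device's normalised range and forward it to the hardware.
    g_currentForce = std::clamp(force, -1.0f, 1.0f);
    // Device SDK call to apply g_currentForce omitted.
}

PLUGIN_API void FFB_Shutdown() {
    // Release the device (device SDK call omitted).
    g_currentForce = 0.0f;
}
```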

The force feedback itself is currently being used to transmit two sensations to the user: the friction of the tyres on the ground when they try to steer the wheels while stopped, and the inertia of the car trying to turn the wheels in the direction we are moving.
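
Purely as an illustration of how those two sensations might be blended into a single normalised force each frame (the constants and the blend are assumptions, not the simulator’s model):

```cpp
// Illustrative blend of the two effects described above into one normalised force.
// Constants are placeholders; a real model would be tuned against the hardware.
#include <algorithm>
#include <cmath>

float steeringForce(float steerAngle, float steerVelocity, float carSpeed) {
    // Static friction resists turning the wheel, strongest when the car is stopped.
    float frictionScale = std::exp(-carSpeed / 2.0f);            // fades out with speed
    float friction = -0.6f * steerVelocity * frictionScale;

    // Self-aligning effect: at speed the wheels are pulled back towards straight ahead.
    float aligning = -0.4f * steerAngle * std::min(carSpeed / 20.0f, 1.0f);

    return std::clamp(friction + aligning, -1.0f, 1.0f);         // normalised output
}
```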

When we combine all these real-feeling sensations with the incredibly accurate virtual environments the team are building, there will be few, if any, more realistic driving simulators anywhere in the world – including in F1, let alone in Traffic Officer driver training!