WEEK COMMENCING 27 JAN – ALL ROADS LEAD TO BIRMINGHAM

The ongoing work for Highways England and their desire to train Traffic Officers in an immersive environment takes centre stage for much of our working time, but there’s plenty going on. Not least is our search for new talent to join our team and decisions about where they’ll sit. Here’s our week.

Toby – I’m in Charge Here

Currently, I split my week between a client office in Bristol and our own in West London.

There’s all sorts going on this week: discussions around a forthcoming GIS-based (Geographic Information System) Augmented Reality solution; ongoing conversations with one university regarding a tie-up on our Innovation Centre, and with another about a collaborative project sponsored by a shared client; and an exciting introduction to the Head of Corporate Events within an existing customer.

And to end the week with more practical stuff, I had to consider how best to accommodate new team members into our existing London office space and how to improve the organisation of all our equipment onsite (and off). Then we got onto the tricky subject of desks: should we have a single desk for the team, or a series of desks joined together? Big questions…

Also, we did some follow-up work on the issue of “VR done badly can make you feel sick”, testing our solutions to check. The problem is that individuals who experience immersive environments regularly get used to it.

This makes our team useless as testers – even if they tried, they wouldn’t get sick. So, we need to use guinea pigs (not the fluffy pet) and I’m game to test most stuff in the office (see Teslasuit “Electrocution”) but what I excel most at is nausea testing. I get nauseous a lot; car journeys, hangovers, anaemic tea, etc.

Josh – Programme Manager

My week involved further work on our Eye Tracking Study. We undertook a two-day test at the Highways England offices in Birmingham, looking at how people react to health and safety information, how much they retain, and how its presentation can be optimised so that the most critical information can be recalled quickly when it’s needed.

The eye tracking study focuses on what people look at, how often they look at it and for how long. With this information we create heat maps and gaze plots, as well as undertaking traditional statistical analysis to attempt to answer the question: ‘has a person remembered what they’ve looked at?’ The test was OK, but it kinda broke on the second day. Actually, it definitely broke, so I’ll be back next week.
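As a rough illustration of the idea (not our actual tooling), gaze samples can be binned into a coarse grid to form a heat map, and dwell time on an area of interest can be estimated from how many samples land inside it. The cell size, sampling interval and AOI coordinates below are made-up values for the sketch:

```python
from collections import defaultdict

def build_heat_map(samples, width, height, cell=50):
    """Bin (x, y) gaze samples into a coarse grid of fixation counts."""
    grid = defaultdict(int)
    for x, y in samples:
        if 0 <= x < width and 0 <= y < height:
            grid[(int(x // cell), int(y // cell))] += 1
    return grid

def dwell_time(samples, aoi, sample_interval_ms=4):
    """Estimate total time spent inside an area of interest
    (x0, y0, x1, y1), assuming a fixed eye-tracker sampling interval."""
    x0, y0, x1, y1 = aoi
    hits = sum(1 for x, y in samples if x0 <= x < x1 and y0 <= y < y1)
    return hits * sample_interval_ms
```

The grid counts can then be rendered as a colour overlay, and per-AOI dwell times feed the statistical analysis.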

My week also involved scenario planning, where we detailed and planned two VR scenarios for an upcoming customer experience solution. These involved disabled road users and Traffic Officers’ instructions relating to helping people off the hard shoulder. This project will involve more contact with equality action groups to ensure they are happy with the approach being used.

Our projects are typically undertaken for government organisations and as such require input and planning from a variety of stakeholders. These organisations also need firm completion dates so they can all plan their respective parts.

However, at MXTreality, much of the work we do is work we’ve never done before, and often the specifics of it may never have been done by anyone, which makes forecasting delivery part of the challenge.

I try to mix our agile way of working with their upfront planning methodologies (called waterfall) to ensure both we and the client are happy. In practice this means being painfully honest about what you know and what you don’t, and putting safeguards or boundaries around those unknown elements, so everything is a little less risky.

I spent Thursday afternoon looking at animal behaviours with Stefano, one of our artists, and we’ve discovered horses are subtle, emotionless characters, and anyone who says different is a liar.

Slava – Lead 3D Artist

I spent a lot of time this week making new textures for character faces. The aim was to create something more stylised and to avoid the so-called ‘Uncanny Valley’ phenomenon, where a human-like appearance makes an artificial figure seem more familiar, but becomes unsettling for viewers once the figure tries but fails to mimic a realistic human.

Eventually I made the face look more like a painted portrait, with a couple of variations, such as shaved and bearded versions, in case we need different types.

I was also making a new model of a Mini Cooper with an authentic interior for use in Unreal Engine. This work included the quite tricky job of fitting the interior from a highly detailed model into the game-ready car.

Stefano – 3D Artist

This week I continued preparing different characters for animation. I modified the shapes and textures of existing ones, tweaking or even redoing the skinning of the skeleton, which most of the time gets ruined when undertaking these modifications.

Then I spent time creating different facial blend shapes for characters to be used in ongoing and future projects. In parallel, I’ve learned how to use them correctly with Unity’s Lipsync plugin.
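For readers unfamiliar with the term: a blend shape is a set of per-vertex offsets from the base mesh, and the engine mixes them by weight each frame. A minimal sketch of that mixing idea (not the plugin’s actual code):

```python
def apply_blend_shapes(base_vertices, shapes, weights):
    """Morph a base mesh by adding weighted per-vertex deltas.

    base_vertices: list of (x, y, z) tuples.
    shapes: {name: list of (dx, dy, dz) deltas, one per vertex}.
    weights: {name: weight in [0, 1]} chosen per frame (e.g. by lip sync).
    """
    result = [list(v) for v in base_vertices]
    for name, weight in weights.items():
        for i, delta in enumerate(shapes[name]):
            for axis in range(3):
                result[i][axis] += weight * delta[axis]
    return [tuple(v) for v in result]
```

A lip-sync system then only has to animate the weights over time, never the vertices directly.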

I have been working with the Maya file of a rigged and animated horse. It needs a lot of skinning tweaks, as the bones are not moving the different body parts smoothly and realistically enough.

I’m carefully watching a lot of reference material of horse skeletons, motion, behaviour, emotions and how these are expressed.

A lot of work also needs to be done on the muzzle, since it will be a very important part of the animation. Eyelids, lips and nose must be perfectly rigged and skinned to be able to mimic the complex expressions of a horse’s muzzle.

The horse is one of those animals encountered regularly by Traffic Officers, and our accurate animations, used in an immersive environment, will help train them to cope better with excited horses – which can be dangerous to road users and to those tasked with capturing them!

Cat – Lead Programmer

This week I’ve been continuing to develop our shader-based system for arbitrarily curving roads. It’s now at a stage where the user can sit in a car and see a curved motorway scroll past them, but as yet the curvature is constant, so it feels like you’re going in a big circle.

The next stage of development is to be able to assemble roads with sections of varying curvature. This is a somewhat tricky problem because it involves composing several sets of equations in sequence to render parts of the road that are further away.

To render a section of road in the distance you have to take into account the length and curvature of each intermediate section in order, otherwise distant sections will appear disjointed and not very realistic at all!
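The “compose in sequence” step above can be illustrated outside the shader. Treating each section as an arc of constant curvature, the pose at the end of one section becomes the start of the next, so a distant section’s position depends on the length and curvature of everything before it. This is a minimal sketch of that folding, not our actual shader code:

```python
import math

def advance(pose, length, curvature):
    """Move a pose (x, y, heading) along one road section of constant
    curvature (1/radius; 0 means straight)."""
    x, y, heading = pose
    if abs(curvature) < 1e-9:
        return (x + length * math.cos(heading),
                y + length * math.sin(heading),
                heading)
    # Travelling a distance L along an arc of curvature k turns the
    # heading by k * L; the new position follows from the arc geometry.
    turn = curvature * length
    r = 1.0 / curvature
    x += r * (math.sin(heading + turn) - math.sin(heading))
    y += r * (math.cos(heading) - math.cos(heading + turn))
    return (x, y, heading + turn)

def compose(sections, start=(0.0, 0.0, 0.0)):
    """Fold every (length, curvature) section in order, so each distant
    section lines up with all the sections before it."""
    pose = start
    poses = [pose]
    for length, curvature in sections:
        pose = advance(pose, length, curvature)
        poses.append(pose)
    return poses
```

Skipping the fold and placing each section independently is exactly what makes distant sections appear disjointed.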

Sergio – Programmer

UI / UX Final Implementation

For the past couple of weeks, I have been continuously sketching and prototyping different approaches for the VR Interfaces that our current solution would require. Last week I implemented our final design that appears to tick all the necessary boxes.

The design encapsulates elements of my previous research, such as a curved interface and minimal navigation steps. Curved interfaces keep all content at an equal distance from the user while minimising the interactions required, which decreases the learning curve.
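The equal-distance property comes from laying panels out on an arc centred on the viewer. As a rough sketch (the radius, arc angle and height here are invented values, not our production layout):

```python
import math

def arc_layout(n_items, radius=2.0, arc_degrees=90.0, height=1.5):
    """Place n_items on a horizontal arc centred on the user, so every
    panel sits at the same distance (radius) from the viewpoint.

    Returns (x, y, z) positions with the user at the origin, looking
    along +z; each panel can then be rotated to face the origin.
    """
    positions = []
    arc = math.radians(arc_degrees)
    start = -arc / 2
    step = arc / (n_items - 1) if n_items > 1 else 0.0
    for i in range(n_items):
        angle = start + i * step
        positions.append((radius * math.sin(angle),
                          height,
                          radius * math.cos(angle)))
    return positions
```

Because every panel is the same distance away, the user never has to refocus between near and far elements.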

One of the problems that we encountered with spatial user interfaces in VR was unstable collision detection between triggers and buttons. This issue happens because of the limited floating-point precision that game engines provide for physics interactions.

The values can be unpredictable and create unwanted results. To solve this problem, we opted for ray casting, which is commonly used in 3D computer graphics for things like creating projectile paths, intersection tests and bounds tracing. By casting a ray from an origin in the desired direction, we can define the limits of the triggers without using geometry colliders.
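The core of that approach is a ray-versus-rectangle test: intersect the ray with the button’s plane, then check the hit point against the button’s own axes. A minimal sketch of the geometry (assumed names and parameters, not our engine code, which in practice would use the engine’s built-in ray cast):

```python
def ray_hits_button(origin, direction, centre, normal, right, up,
                    half_w, half_h):
    """Return True if a ray lands inside a rectangular button defined by
    its centre, facing normal, local axes and half extents.

    All vectors are 3-tuples; direction, normal, right and up are
    assumed to be unit length.
    """
    dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))
    sub = lambda a, b: tuple(ai - bi for ai, bi in zip(a, b))

    denom = dot(direction, normal)
    if abs(denom) < 1e-6:          # ray parallel to the button plane
        return False
    t = dot(sub(centre, origin), normal) / denom
    if t < 0:                      # button is behind the ray origin
        return False
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    local = sub(hit, centre)
    # Project the hit point onto the button's axes and compare against
    # its half extents.
    return abs(dot(local, right)) <= half_w and abs(dot(local, up)) <= half_h
```

Because the test is pure geometry evaluated on demand, there are no physics-step collisions to misfire.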

To render the text on the geometry, we utilised a type of texture (Render Texture) that is rendered through the camera viewport from a 2D canvas. We can then easily manipulate the text size, font and colour in real-time. This is very helpful as our current solution contains multiple lines of dialogue that the user can choose from during the experience, without necessarily changing the geometry or any imported texture.

And that’s it for another busy week.