Build a Virtual Reality World to do an SAP research survey?
As someone who doesn’t go to conferences and events, I hate it when someone tries to make me answer a questionnaire. Can’t stand it. But when I first saw VR technology being demoed, I had to have a go. Looking back, I remember seeing the HTC Vive being demoed in GAME: I signed up, entered my details, and queued up for my turn.
Essentially, in our VR world you’re filling out a questionnaire. But our questionnaire isn’t sitting on some boring pad of paper waiting to be filled in; it’s inside a Virtual Reality game that you get to play. I can confidently assume that a lot of people who attended the SAP UKISUG Connect event will remember us more vividly than most other stands, simply because they had a go at VR, or simply watched someone else play.
Why Unreal Engine 4?
The tool we used to create the Virtual Research Centre is Unreal Engine 4 (UE4).
UE4 has been used in media marvels that have stormed the world with success. Fortnite is a game created by Epic Games using UE4 and is a trending phenomenon, which has amassed more than 78 million players worldwide. UE4 has also been used in the Star Wars franchise for rendering some scene shots in Rogue One.
Knowing that I would be working with my colleague Greg Ellis, who would focus on the API side of the programming, UE4 offered open and easy integration through third-party plug-ins. Having used this software since it was first released in 2014, I knew that UE4 would be the exact tool I needed to develop this concept into a Virtual Reality.
The Whole Process
From start to finish, the whole project was a three-month process. It’s worth noting that throughout the project there was constant iteration on the design to improve usability, based on feedback from a variety of people.
The process starts with the storyboard.
A storyboard is a bird’s-eye-view layout of the whole environment that people would be able to explore and play in. In this layout, I labelled where scripted events and player interactions would occur, along with other information, to show the flow of the simulation and give a rough idea of what happens and when.
Once the storyboard was complete, I moved on to ‘whiteboxing’ the environment. Whiteboxing is a games-industry term for creating a 3D layout of the environment out of basic shapes, mainly white boxes. Doing this allowed me to programme a player character, place it into the environment to play with, and therefore actually get into the game.
Usually when you develop games, getting a character into the level helps you judge the scale of things a lot better than just using measurements. But putting yourself into a VR world feels very different: I could stand in the room and examine the world around me. Personally, it felt like a much more natural way to examine and judge things. For example, I measured out a box room to the exact dimensions of our office, but inside the VR world it looked a lot smaller than in reality.
So I knew to make bigger rooms, to get that wow factor.
Moving on from whiteboxing, I built the first section of the space station with 3D assets and started on the programming side of the project. I made the different UIs for menus, such as the questionnaire and keypad, and designed how a player would interact with a menu of any kind.
Originally the plan was to have the player touch an object with their virtual hands, controlled by the motion controllers, and have a menu pop up and attach itself to the player’s camera, so it would stay in the player’s face until they answered the question. This felt rather intrusive, so I changed it so that the menu pops up above the object instead, and the player interacts with the object to access it.
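The object-anchored approach boils down to a simple placement rule: spawn the menu a fixed margin above the interactable object rather than locking it to the camera. The sketch below illustrates that rule in plain, self-contained C++ rather than real UE4 code (in the engine you would spawn a WidgetComponent at the returned location); the `Vec3` type and function names are illustrative stand-ins, not engine API.

```cpp
// Minimal 3D vector stand-in for UE4's FVector (illustration only).
struct Vec3 {
    float x, y, z;
};

// Hypothetical helper: place a menu a fixed margin above the object the
// player touched, instead of attaching it to the player's camera.
// Z is "up", matching UE4's coordinate convention.
Vec3 MenuSpawnLocation(const Vec3& objectOrigin, float objectHeight, float margin)
{
    return Vec3{ objectOrigin.x,
                 objectOrigin.y,
                 objectOrigin.z + objectHeight + margin };
}
```

Because the menu is anchored to the object rather than the camera, the player can simply look or step away from it, which is what removes the in-your-face feeling described above.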
Whilst I was working on the design and accessibility of the UI, Greg was putting together the API. This API is what my UI communicates with live, whilst the player is playing, to identify who they are, so that the API can send the correct data back - such as the questions and statistics.
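The shape of that exchange is: the UI sends a player identifier, and the API responds with the data that player needs next. The sketch below mocks that round trip in self-contained C++ so the flow is concrete; the `Question` struct, the function name, and the example question text are all assumptions for illustration, not the real API (which Greg covers in his linked article).

```cpp
#include <string>
#include <vector>

// Illustrative shape of one item the API might send back to the UI.
struct Question {
    int id;
    std::string text;
};

// Stand-in for the live call: given a player identifier, return the
// questions that player should be shown. In the real project this would
// be an HTTP request to Greg's API rather than a hard-coded list.
std::vector<Question> FetchQuestions(const std::string& playerId)
{
    if (playerId.empty()) {
        return {};  // unknown player: nothing to ask
    }
    // Hypothetical example payload.
    return {
        { 1, "How long have you used SAP?" },
        { 2, "Which modules do you rely on most?" },
    };
}
```

Keeping the UI ignorant of where the questions come from is what lets the question set and statistics change server-side without touching the game build.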
Once the first phase of the core mechanics was complete, I built the rest of the main infrastructure of the space station with the 3D assets. During this part of the process, I made iterations to my original designs, to incorporate the dimensions of the assets and any other design changes that I decided upon.
One of these design changes was the incorporation of the ICC. Originally the room that held the grand finale was going to be a regular presentation hall - imagine a lite version of a TED talk. But we discussed how cool it would be to suddenly end up in the ICC, at the Resulting stand no less.
Loving this idea, I obtained a layout of the building to use as a guide, and I built my rendition of Hall 3 of the ICC in my Virtual Research Facility.
Now that I had the main infrastructure of the environment built, I could build and implement the scripted events that happen for players as they progress through the simulation. The main scripted event was the grand finale: when a player had answered all of the questions, the AI character would malfunction and the virtual world around them would glitch out. Stalls would get sucked out into space, the roof would pull apart, the walls would disintegrate rapidly, and then the player would be dropped into the abyss of space.
During this time, I had a recording session with our very own Derek Prior. Derek is the face and voice actor of the AI on board the Resulting Virtual Research Facility. We recorded all of the voice-over and did some filming with a green screen, and I then asked Ross to apply some special effects.
Whilst working on this, we also hired a sound engineer to create the music and sound effects. It’s worth mentioning that being able to walk around and interact with an environment is awesome on its own, but once you add sound into the equation you’ve entered another level of awesomeness.
Finally the first draft, essentially a prototype, of the project was complete.
This is when I started getting people to help with Quality Assurance. Quality Assurance is a whole process of its own. Its purpose is to find bugs in your programming and flaws in your design, and just to see how someone else plays in your world. When you’re building a project, you know how everything works, so you naturally know what to do.
That’s great and all, but if you’re not the only person who’s going to use it, then you need people to test it out and see if they come across any problems. So I got a variety of people to do some Quality Assurance for me, and I made notes of the many changes that needed to take place - which is a good thing.
Nothing is perfect the first time round.
After a few rounds of Quality Assurance and making the tweaks, the SAP Virtual Research Facility was complete.
What Skills Are Needed To Make an SAP Virtual Research Facility?
Not a question I get asked daily, I will admit. But here are the skills needed to build an SAP Virtual Research Facility.
Creative Thinking & Planning:
Sure, anyone can think of a creative idea, but what I’m talking about is creatively thinking about how you’re going to realise your idea. Being able to successfully plan and display the workflow of your idea is key to building anything successfully.
For example, in a project like this, a lot of code will communicate with in-game objects and events.
Flow Control:
The practical goal in this virtual world is to get players to answer all of the questions. How do I guide players from question to question, start to finish, effectively? I learned a lot about this during my Masters degree, as discovering what entices a player to explore an environment was my key area of study.
Outside of this virtual world, flow control is used everywhere in public: airports, bus stations, event buildings, towns, cities and so on. They all guide people from A to B and everywhere in between.
Knowing how to guide a person is essential to making the project streamlined and effective. Bad flow control could result in a person getting lost and not completing your questionnaire, leading to incomplete data. Good flow control results in people doing everything you want them to, and in good time as well.
Being Analytical & Frequent Testing:
Being analytical is key when building anything in a virtual world, or with code. You need to be spot on when setting things up and implementing functions. Something will inevitably go wrong at first - it may only be a small problem - but without being analytical you won’t find out why it’s gone wrong, and that’s the real issue.
Frequently testing your work, and being analytical as you do it, ensures that you find issues sooner and can fix them straight away, rather than discovering them later and potentially having to redo a lot of work.
Ask yourself questions like: why should I do it this way? How will this work with everything else? Is there a more effective way of doing this? These questions allow you to properly analyse your methods and thinking, and help you decide on the best solution.
Being analytical is a timesaver, stress saver, and it’s cost effective.
The Mastermind Behind the API
For some extra context, you can read more about why we did all of this on the main article by our MD, Stuart Browne, over here: The making of our UKISUG Virtual Research Centre.
If you would like to read more about how and why the API was created, our programmer Greg Ellis elaborates on that here: For my next trick - Cloud Graph Databases, APIs and Virtual Reality.