Strapped into a cockpit chair with VR goggles, headphones and other accessories, participants were able to slip into the role of a co-pilot aboard the space shuttle. Together with the virtual pilot Charles Overmyer, they are supposed to dock the shuttle with the ISS – but the mission goes differently than planned (asperity-tec.com). The team came up with many details for the interactive project: in addition to the actual VR experience, they also created a fictitious company called Asperity Technologies Corporation – complete with a corporate design (postcards included), a website and a promotional film. The trade fair stand at FMX 2018 was elaborately designed, too – and anyone who didn’t run into the FMX beaver could continue their selfie collection with an astronaut from Asperity. We spoke to the team to find out details about the implementation in Unity, the tools used, the workflows employed and more.

DP: For those who didn’t have the chance to try out your project at FMX: What can users expect to find in Asperity?
Lena Lohfink: Asperity is an interactive, cinematic virtual reality experience in which the viewer embarks on an adventurous space journey. At our Asperity Technologies Corporation stand, users can not only learn a lot about the company but also take a seat in a physical replica of a cockpit chair. Then it’s time to don the VR goggles and headphones, slip on the controller glove, grab the joystick – and off you go! This interactive 360-degree room installation simulates a space flight as realistically as possible by exposing users to additional external, physical stimuli, actively involving them in the experience.

DP: What goals have you set yourselves for the project?
Sebastian Ingenfeld: Personally, I’ve always been interested in space travel – both the real science and the fiction of science fiction. The iconic space shuttle has always been my favourite, and when I stood in front of the decommissioned “Atlantis” in Florida in 2016, there was no holding me back. The main aim was always to entertain. However, it was also important to me to strike a balance between captivating entertainment and a credible scientific background. The end result should be a piece of VR entertainment that I would have liked to experience myself as a viewer and that wouldn’t leave every scientist scratching their head.

DP: How big was your core team and what additional people did you work with?
Lena Lohfink: The core team consisted of the director, lead technical director and producer. In total, however, around 30 people were involved in the project, including US and Canadian voice actors, German and Slovenian actors and many great artists (programming, animation, sound design, music etc.) from the Baden-Württemberg area.
DP: How much experience did you have with VR & interactivity beforehand?
Sebastian Ingenfeld: I mainly have experience with classic short films. However, I had already worked on an interactive installation at the Film Academy and was able to pick up the necessary basics quickly. Still, I had to adapt to many new workflows and working methods during creative pre-production – but that was also a lot of fun.

DP: Over what period of time was “Asperity” created?
Lena Lohfink: Pre-production and production of “Asperity” began in October 2017 and ended with our graduation in May 2018. However, the idea and the script came about much earlier, at the end of 2016, and some preparations were already made in spring 2017.
DP: What did your project management look like? Did you use any special tools?
Lena Lohfink: As this was our first VR project, I gathered a lot of information together with the artists at the beginning and looked for comparable projects. In the resulting production plan, we divided the seven-month production time, including pre-production and project completion, into five major milestones – in our case, user tests. Certain elements of the film/game had to be in production by each of these milestones, so our weekly targets and deadlines were derived from them. Following agile project management methods, we defined the various task packages from week to week, distributed them to those responsible and worked through them. The production plan, weekly logs with targets and so on were mainly created in Excel, Google Spreadsheets and InDesign. We also worked with game and flow charts, used handwritten to-do lists to track the task packages and tried to save time through direct communication channels.

DP: How did the planning and work on this differ from previous projects?
Lena Lohfink: In comparison, the task packages of previous projects were much easier to organise into almost linear processes whose individual tasks were interdependent and interconnected – a given task could only start once other tasks had reached a certain point or had been completed. With “Asperity”, user testing was the linchpin of almost all task packages and areas, which meant there was an incredible number of individual, independent task areas – many open work sites at the same time. Monitoring the work processes was therefore much more complex, and the various game elements had to be adjusted and re-evaluated on a weekly basis against the engine’s capabilities and the results from user testing.
DP: Which hardware and software tools were mainly used in the respective project phases? What did your pipeline look like?
Sebastian Ingenfeld: I mainly work in Cinema 4D and modelled or assembled the first assets there – I also created and revised all the UVs there. In general, the interior of the cockpit – which you have in front of you for almost the entire experience – consists of a lot of geometry. The base textures were created in Substance Painter, but I then mostly reworked them in Photoshop to break up the procedural look. We assembled the experience in Unity 3D and used the SteamVR plug-in to drive the HTC Vive. The cockpit, including the displays, the astronaut, the effects and the sunlight, runs in real time. The intercom or video-chat connection to Mission Control and the phenomenal view of the Earth are pre-rendered video textures – these now run really well in Unity.

DP: Which file formats did you work with?
Sebastian Ingenfeld: Basically, “Asperity” is a patchwork of many assets from different sources – various artists modelled for us, and we also bought assets in. For the import into Unity, I have to say that .fbx is wonderful for models and animations. Our video textures run solidly as .mp4/H.264. We even use a 360-degree video that we map onto the inside of a sphere.
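As a rough illustration of how such a pre-rendered clip can be used as a video texture in Unity – a minimal sketch, not the team’s actual code; the clip and component names are invented – the built-in VideoPlayer can render straight into a material:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Minimal sketch: plays a pre-rendered .mp4/H.264 clip as a texture on a mesh,
// e.g. the inward-facing sphere used for a 360-degree background.
[RequireComponent(typeof(VideoPlayer))]
public class VideoTextureBackground : MonoBehaviour
{
    [SerializeField] private VideoClip clip;          // the imported .mp4 asset (placeholder)
    [SerializeField] private Renderer targetRenderer; // e.g. the sphere's MeshRenderer

    void Start()
    {
        var player = GetComponent<VideoPlayer>();
        player.clip = clip;
        player.isLooping = true;
        // Decode frames directly into the material's main texture.
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = targetRenderer;
        player.targetMaterialProperty = "_MainTex";
        player.Play();
    }
}
```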
DP: How many assets did you create for the project and which templates did you use?
Sebastian Ingenfeld: The cockpit alone, as a set, consists of countless components with thousands of switches. It was particularly helpful that NASA provides a large number of plans and drawings for free; we mainly stuck to these and essentially copied them. However, we had to modify the design of the original NASA space shuttle cockpit slightly for our story and add a little science fiction here and there. Apart from the pilot, we only had to model a bit of space debris and build a believable Earth as a moving 2D matte painting – after all, the shuttle is flying into the sunset. Then there are our “floating props” – basically everyday astronaut objects: a torch, a logbook, chewing gum. These props float through the cockpit and can be touched by the user.

DP: Was it possible to use existing assets, e.g. assets provided online by NASA?
Sebastian Ingenfeld: NASA provides free 3D models for download, but these didn’t make it into the game in the end. Due to time constraints, we bought the model of the ISS – NASA also offers a finished model free of charge, but it would have required a lot of additional retopology time, which we wanted to save. Philipp Maas derived and textured a Unity-ready asset for us. For our matte paintings, we naturally used existing shots from orbit. We also drew on the NASA archive for our intro and placed our own designs using classic VFX workflows.

DP: What did the realisation of the cockpit and the pilot look like in detail?
Sebastian Ingenfeld: During pre-production, I designed a corporate identity and corporate design for our fictitious company Asperity Technologies Corporation at an early stage. Inspired by this, our costume designer Marika Moritz created the astronaut suit in Marvelous Designer and virtually sewed it together – only a little retopology was needed here. At the same time, Alexander Frey sculpted the helmet in ZBrush and then modelled it game-ready in Maya. Our technical director Seyed Ahmad Housseini was mainly responsible for a clean character animation pipeline. We had roughly recorded all the necessary movements for the astronaut using motion capture, but our animator Maike Koller still had to animate many details on top. In Maya, the mocap rig was finally merged with the animation rig into a game rig and then pushed to Unity via .fbx – there we were able to fade from take to take or send the character to idle using our event system.
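Purely as an illustration of that last step – fading between takes and returning to idle – a minimal sketch with Unity’s Animator might look like this (the state names and the controller class are invented, not the project’s actual event system):

```csharp
using UnityEngine;

// Sketch: cross-fades the astronaut's Animator between mocap takes
// and returns him to an idle state. State names are hypothetical.
public class PilotTakeController : MonoBehaviour
{
    [SerializeField] private Animator animator;
    [SerializeField] private float fadeDuration = 0.25f;

    // Called by the event system when the story advances to a new take.
    public void PlayTake(string stateName)
    {
        animator.CrossFadeInFixedTime(stateName, fadeDuration);
    }

    public void ReturnToIdle()
    {
        animator.CrossFadeInFixedTime("Idle", fadeDuration);
    }
}
```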

DP: What interaction options does the user have in the cockpit? How could these be realised with Unity?
Matthias Heim: The user wears a glove with a Vive tracker to control their hand as a space tourist in the cockpit, which was realised with inverse kinematics. This allows them to operate switches to progress through the story. In addition, objects float in zero gravity for the player to interact with; simulated rigid bodies were used for this. To dock with the ISS, the user can control the rotation of the shuttle with a real joystick, whose input Unity was able to read as a normal USB controller.
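As a hedged sketch of the joystick part: Unity’s legacy Input Manager exposes a USB joystick through named axes. The axis names below are placeholders for whatever mapping the project actually used:

```csharp
using UnityEngine;

// Sketch: reads a flight joystick via Unity's legacy Input Manager (the stick
// appears as an ordinary USB game controller) and applies its deflection as a
// slow rotation of the shuttle. "Pitch"/"Yaw" must be configured as axes in
// the Input Manager; the names and tuning values here are invented.
public class ShuttleRotationControl : MonoBehaviour
{
    [SerializeField] private Transform shuttle;
    [SerializeField] private float degreesPerSecond = 2f;

    void Update()
    {
        float pitch = Input.GetAxis("Pitch");
        float yaw   = Input.GetAxis("Yaw");
        shuttle.Rotate(pitch * degreesPerSecond * Time.deltaTime,
                       yaw   * degreesPerSecond * Time.deltaTime,
                       0f, Space.Self);
    }
}
```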
DP: What animations had to be created for docking with the ISS?
Matthias Heim: For the docking of the space shuttle, the position was animated by hand, while the player controls the rotation using the joystick. In addition, the displays in the cockpit reflect the position and rotation of the shuttle; their readouts were animated procedurally from these values.
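To illustrate the idea of displays driven procedurally from the shuttle’s transform, here is a minimal sketch (the UI readout and the class are invented, not the project’s actual displays):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: a cockpit display that mirrors the shuttle's current attitude.
// A simple UI Text readout stands in for the project's procedural displays.
public class AttitudeDisplay : MonoBehaviour
{
    [SerializeField] private Transform shuttle;
    [SerializeField] private Text readout;   // hypothetical UI element on the panel

    void Update()
    {
        Vector3 euler = shuttle.rotation.eulerAngles;
        readout.text = string.Format("PITCH {0:0.0}\nYAW {1:0.0}\nROLL {2:0.0}",
                                     euler.x, euler.y, euler.z);
    }
}
```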

DP: Why did you decide in favour of the Unity Engine?
Matthias Heim: As Unity is very widespread, most of the project team had already had contact with this game engine, and our two programmers had experience from previous Unity projects. The software gave us the opportunity to quickly develop customised tools, such as a node-based event system, which made it easy for non-programmers to create content and edit the storyline. Even though other engines such as Unreal offer better graphics out of the box than Unity, this is not necessarily an advantage for VR applications: every image has to be rendered twice and at a very high frame rate, and Unity has already taken steps to improve performance for exactly these applications.
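The node-based event system was the team’s custom tooling; purely as a sketch of the underlying idea – story beats as nodes whose actions non-programmers can wire up in the Inspector – one could imagine something like this (all names invented):

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch of the idea behind a node-based event system: each story beat is a
// node that fires its actions and hands control to the next node. This is an
// invented minimal version, not the project's actual tool.
public class StoryNode : MonoBehaviour
{
    [SerializeField] private UnityEvent onEnter;   // e.g. play a take, switch a light
    [SerializeField] private StoryNode next;       // wired up in the Inspector

    public void Enter()
    {
        onEnter.Invoke();
    }

    // Called when the user completes this beat, e.g. flips the right switch.
    public void Advance()
    {
        if (next != null) next.Enter();
    }
}
```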

DP: How did the exchange and integration of the sound design go – which middleware was used for this in Unity?
Pablo Knupfer: We worked very iteratively and defined the important sound events in advance. A large part of the sound design was also created at this stage – in snippets, so to speak – and was then integrated into the sound engine and thus into Unity. Binaural sound was necessary to create the most immersive experience possible and a particularly realistic soundscape – after all, sound sources needed to be localisable. This was implemented using the Wwise sound engine together with the Oculus Spatializer.
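In the Wwise Unity integration, such pre-defined sound events are posted from script; a minimal sketch (the event name is a placeholder – posting on a GameObject is what lets the spatialiser localise the sound at its position):

```csharp
using UnityEngine;

// Sketch: triggering a pre-defined Wwise sound event from Unity. The Wwise
// Unity integration exposes AkSoundEngine; the event name is hypothetical.
public class CockpitSoundTrigger : MonoBehaviour
{
    public void OnSwitchFlipped()
    {
        // Posting the event on this GameObject lets Wwise (plus the
        // spatialiser) place the sound at the switch's position.
        AkSoundEngine.PostEvent("Play_Switch_Click", gameObject);
    }
}
```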
DP: How did you bring conventional VFX elements into the real-time environment of the game engine?
Sebastian Ingenfeld: I used to do a lot of 2D compositing for live-action film and feel at home in classic VFX workflows. So my original plan was to pre-render elements such as explosions, impacts or space debris and integrate them as video textures. However, once all the hard 3D assets and the video textures for the backgrounds were integrated into the Unity scene and the performance and frame rate were still good, we reconsidered this decision. We now use a particle-based library for most effects such as fire, smoke and the visible rocket bursts. Our space debris is driven by Unity’s physics system, which also controls our floating props and things like glass shards inside the cockpit. The last remnants of VFX can be found in the view of the Earth – a moving matte painting of real footage from Earth orbit. We also retouched a lot of licensed archive footage for our intro – a fictitious promotional film for Asperity Technologies Corporation – and composited our own actors into our cockpit.
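As a hedged sketch of how weightless debris and floating props can be handed over to Unity’s physics – gravity off, a small initial push and spin, then the engine takes care of drifting and collisions (the tuning values are invented):

```csharp
using UnityEngine;

// Sketch: a floating prop or piece of space debris in zero gravity.
// Gravity is switched off and the Rigidbody gets a gentle push and spin,
// after which Unity's physics keeps it drifting and colliding.
[RequireComponent(typeof(Rigidbody))]
public class ZeroGravityDrift : MonoBehaviour
{
    [SerializeField] private float maxPush = 0.05f;  // placeholder tuning values
    [SerializeField] private float maxSpin = 0.1f;

    void Start()
    {
        var body = GetComponent<Rigidbody>();
        body.useGravity = false;
        body.AddForce(Random.insideUnitSphere * maxPush, ForceMode.Impulse);
        body.AddTorque(Random.insideUnitSphere * maxSpin, ForceMode.Impulse);
    }
}
```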
DP: What other means did you use to create the most immersive atmosphere possible for the user?
Sebastian Ingenfeld: The experience begins even before the user puts on the VR goggles. You experience “Asperity” exclusively on a replica of a space shuttle pilot’s chair, into which you are first strapped tightly. Thanks to the glove, users can not only see their own hand in the virtual world but also use it to interact with that world. In the finale, a joystick mounted on the chair becomes important. Within the story, it matters a great deal that the user’s journey begins with the experienced pilot Charles Overmyer at their side – the astronaut sits right next to the player and confidently steers the shuttle.

The moment when Charles Overmyer suddenly dies, leaving the inexperienced user virtually alone, is all the more shocking. At the same time, the lights in the cockpit go out and all the air rushes out of it; the lighting and sound mood changes from one second to the next. Very early in the production phase, we experimented with the lighting of the instruments and the cockpit, so that the lights adapt to the story and the mood. Outside the cockpit windows, we also see the Earth slowly turning from its sunlit side to its dark side. On the sound level, the experience does without audible music for most of its running time. However, Pablo Knupfer makes the seat vibrate via a bass channel and mixes an atmospheric ambience from a variety of rocket and ventilation noises, as you might find inside a shuttle.
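A lighting change like that blackout can be scripted as a simple fade across the instrument lights; a minimal sketch, not the production setup (timing values invented):

```csharp
using System.Collections;
using UnityEngine;

// Sketch: fading the cockpit lighting when the story mood flips,
// e.g. at the blackout after the pilot's death.
public class CockpitLightMood : MonoBehaviour
{
    [SerializeField] private Light[] instrumentLights;
    [SerializeField] private float fadeTime = 1.5f;  // placeholder duration

    public void FadeTo(float targetIntensity)
    {
        StopAllCoroutines();
        StartCoroutine(Fade(targetIntensity));
    }

    IEnumerator Fade(float target)
    {
        float[] start = new float[instrumentLights.Length];
        for (int i = 0; i < instrumentLights.Length; i++)
            start[i] = instrumentLights[i].intensity;

        for (float t = 0f; t < fadeTime; t += Time.deltaTime)
        {
            float k = t / fadeTime;
            for (int i = 0; i < instrumentLights.Length; i++)
                instrumentLights[i].intensity = Mathf.Lerp(start[i], target, k);
            yield return null;
        }
        foreach (var light in instrumentLights)
            light.intensity = target;
    }
}
```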
DP: What major technical problems arose during the course of the project and how did you solve them? Were there any ideas that you weren’t able to realise?
Sebastian Ingenfeld: There were always minor difficulties when exchanging assets, especially with our pilot, where we had a complex pipeline. Our TD had many more ideas for technical features that we unfortunately didn’t manage to realise – for example, we toyed with the idea of implementing additional communication with Mission Control via AI-controlled voice commands. Personally, I would now like to realise the project outside of VR – with a functioning, haptic replica of the shuttle cockpit and stereoscopic projections outside the windows. That would be a dream!

DP: What resolutions did you work with to achieve the appropriate sharpness and image quality?
Sebastian Ingenfeld: The experience runs at 60 fps, and the same applies to our video textures. The resolution of the VR goggles remains problematic – we had to enlarge the labelling and even some displays within the cockpit so that the letters and symbols could be deciphered at all. Unfortunately, we are limited by the hardware and its resolution. The cockpit textures are correspondingly high-resolution in the direction of flight – but if you were to use roomscale and stand up from the seat, you would quickly notice that we saved a lot of resolution in unimportant places.

DP: What did you particularly like about the project?
Matthias Heim: “Asperity” was my first big project in virtual reality, a technology that really fascinates me.
Lena Lohfink: From his diploma project “Asperity”, Sebastian Ingenfeld developed a complete experience with a background story (fictional company, promotional film, website, exhibition stand with spaceship and much more). Producing “Asperity” in its entirety was exciting, very varied and constantly presented me with new challenges and tasks that required all my previous skills to create something new (several voiceover recordings, live-action shooting with set construction, VFX production, binaural sound recordings, game and app programming, etc.). I’ve grown from that.
Sebastian Ingenfeld: I particularly liked the subject matter and the opportunity to break out of my personal comfort zone of linear film. 360 degrees combined with the interaction really challenged me. That was great!
Pablo Knupfer: It was great for me to gain experience in the production of sound design for VR and spatial audio.

DP: Is it possible for interested parties to try out or view “Asperity” somewhere (e.g. on the website in the future)?
Sebastian Ingenfeld: “Asperity” is designed as an installation and can only be fully enjoyed as such. The pilot’s chair and our entire setup are interwoven with the experience, so the application will not be made available online for end users. We had our premiere at FMX 2018 and were fully booked on every day of the trade fair. We are currently in talks with various interested parties about exhibiting “Asperity” on a seasonal or permanent basis. Until then, our website announces when and where the experience can be tried for a short time.
DP: What are your hopes for the future of VR experience design? What technical issues do you think need to be fixed?
Sebastian Ingenfeld: For me, the biggest problem is the goggles themselves: the physical resolution and the limited field of view. With our HTC Vive from 2017, the pixel grid is still clearly visible. Being forced to use the SteamVR software also limited us in some features and is – in my opinion – not particularly user-friendly. A discreet background client with fewer adverts, no login and account requirements and no update anxiety would be much more pleasant.

DP: What’s next for you after the diploma project?
Lena Lohfink: Now that we’ve completed the project, we’re starting to commercialise it and are looking for a buyer for our product. At the same time, I’ll be looking for a job and continuing to produce great projects that inspire people.
Sebastian Ingenfeld: Together with Lena, I’m now looking after the exploitation and possible further development of “Asperity” – perhaps we’ll soon have a co-operation that makes everyone involved happy.


