Until now, “Sonar” could only be viewed on Samsung’s Gear VR, but director and developer Philipp Maas has now brought the 360-degree film to Google’s headset as well (for €1.99). After all, with every new VR headset, the number of users who have not yet seen “Sonar” grows.
In addition to this second platform, the film has also received a spatial audio update: the VR audio specialists at DELTA Soundworks in Heidelberg have given “Sonar” a new, spatial sound experience. In 2014, this was not yet feasible for the students at the Film Academy, nor was the impression of visual depth created by omni-directional stereoscopic 3D.
The team used the dearVR Spatial Connect tool from Dear Reality, which allows the film to be mixed while the mixer is inside virtual reality. There is even a mix in third-order Ambisonics, which the team will roll out as soon as the platforms support the format. The viewer is now drawn into the eerie atmosphere of the space scenario even more than before.
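For context on the format: an Ambisonics mix stores the sound field as spherical-harmonic components rather than fixed speaker channels, so the channel count grows quadratically with the order. A minimal sketch of that arithmetic (the (N+1)² formula is standard Ambisonics theory, not a detail from the production):

```python
def ambisonic_channels(order: int) -> int:
    """Channel count of a full-sphere Ambisonics mix of the given order."""
    return (order + 1) ** 2

# First order (classic B-format, W/X/Y/Z) needs 4 channels;
# the third-order mix mentioned above needs 16.
for order in (1, 2, 3):
    print(f"order {order}: {ambisonic_channels(order)} channels")
```

That higher channel count is why platform support matters: the player has to decode all 16 channels and rotate the sound field with the viewer’s head.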
DP: Many people want to make full CG 360-degree films, but very few know exactly how. How did you approach your first VR project?
Philipp Maas: “Sonar” began as a classic CG animation film in CinemaScope format. To produce as independently as possible, I set up 4 TB of network storage and two of my own workstations in our project room, giving us a self-sufficient LAN and enough rendering power. I also decided early on to work with the GPU renderer Redshift. Autodesk Maya and ZBrush, as well as two additional workstations, were provided by the Film Academy. By chance, we got our hands on the Oculus Rift Development Kit 1 through my brother Alexander Maas, who also wrote the music for “Sonar”. After the decision in favour of 360 degrees was made, we spent a week researching online to answer questions such as: What is possible? What software should we use? And much more. We were very apprehensive about familiarising ourselves with a game engine like Unity in the short time we had left for the project, as none of us had any experience with interactivity. On the other hand, our goal was to create a passive, very cinematic experience anyway. Unfortunately, this meant doing without some features that a game engine would have given us, such as stereoscopy and spatial audio. Fortunately, we were able to add these things later. It is also an advantage to distribute only a video, i.e. a pre-rendered VR experience, as distributing software that has to keep pace with ever-new drivers, operating systems and headsets would have been far more time-consuming for us.
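To make the “pre-rendered VR experience” concrete: a 360-degree video is typically stored as an equirectangular frame, and the player maps each pixel to a viewing direction on a sphere around the camera. A minimal sketch of that mapping, assuming the common equirectangular convention (illustrative, not code from the “Sonar” pipeline):

```python
import math

def equirect_pixel_to_ray(u: float, v: float) -> tuple[float, float, float]:
    """Map normalized equirectangular coordinates (u, v in [0, 1])
    to a unit viewing direction, with +y up and +z straight ahead."""
    lon = (u - 0.5) * 2.0 * math.pi   # longitude: -pi (left) .. +pi (right)
    lat = (0.5 - v) * math.pi         # latitude: +pi/2 (up) .. -pi/2 (down)
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))

# The centre of the image looks straight ahead:
print(equirect_pixel_to_ray(0.5, 0.5))  # (0.0, 0.0, 1.0)
```

The stereoscopy added later works on the same principle: an omni-directional stereo render additionally offsets each eye’s ray origin sideways, which is what bakes the depth impression into the panorama.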
DP: In 360 degrees, you face the difficulty that you cannot actively direct the viewer’s gaze. How did you approach the storytelling?
Philipp Maas: We gave this a lot of thought, and the decision to move the camera was made very early on. A moving camera automatically implies a direction of travel (nobody wants to sit in the passenger seat facing backwards for long…). Very clear points of interest, such as a moving spaceship with bright lights, also naturally attract the viewer’s attention. In the design, we made sure to use classic compositional devices such as converging lines and framing. The most important thing in every shot is to define a focus and give the viewer time to find it.
DP: How did you go about avoiding motion sickness for the viewer? How were you able to test it before the release?
Philipp Maas: We tried to show our animation colleagues animatics on the Oculus as early as possible and observe their behaviour and reactions. That gave us incredibly valuable feedback. We knew from the beginning that we couldn’t do without camera movement altogether. Slow movements are fine, and the film keeps returning to static shots that give the viewer time to orient themselves and recover. However, the freedom to turn one’s head should never be taken away from the viewer, so the only camera rotation we allow ourselves is a “roll” that tilts the horizon, because that is not a movement we constantly make with our own heads. Motion sickness can even be used deliberately as a stylistic device, namely when it makes dramaturgical sense and the film is meant to feel truly physical.
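One way to picture these rules of thumb is as a comfort check on a pre-rendered camera path. The sketch below simply encodes what is said above; the threshold values are illustrative assumptions, not figures from the “Sonar” production:

```python
# Illustrative thresholds, not values from the "Sonar" production.
MAX_SPEED = 3.0       # m/s: "slow movements are fine"
MAX_ROLL_RATE = 10.0  # deg/s: gentle horizon tilts only

def comfort_warnings(speed_ms: float, yaw_dps: float,
                     pitch_dps: float, roll_dps: float) -> list[str]:
    """Check one camera keyframe against the comfort rules above.
    Yaw and pitch belong to the viewer's head, so the camera itself
    should not animate them; a slow roll (tilting the horizon) is fine."""
    warnings = []
    if speed_ms > MAX_SPEED:
        warnings.append("camera translation too fast")
    if yaw_dps != 0 or pitch_dps != 0:
        warnings.append("camera yaw/pitch fights the viewer's head movement")
    if abs(roll_dps) > MAX_ROLL_RATE:
        warnings.append("horizon tilt too aggressive")
    return warnings

print(comfort_warnings(1.0, 0.0, 0.0, 5.0))   # [] -> comfortable
print(comfort_warnings(1.0, 20.0, 0.0, 0.0))  # yaw animation flagged
```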
DP: How long did you work on “Sonar” in total? With how many people?
Philipp Maas: The core production ran from April to July 2014. The team consisted of just three people: Dominik and I were responsible for the visuals and split the 3D and 2D work, and my brother Alexander created the music and sound design. We then spent another three months, from June to September 2015, on the technical update. The app was developed in December 2015. We are currently mixing a new soundtrack in Ambisonics in cooperation with DELTA Soundworks. The project keeps giving us opportunities to learn new things: the basic concept of the film works, and it has always been worth taking the film further technically.
“Sonar” was created as a student project at the Filmakademie Baden-Württemberg.