Master of Digital Media students developed a virtual reality experience for their Immersive Environments (IAT 445) class at SFU this past summer. Immersive Environments is an elective course that MDM students can take to fulfill their electives requirement. In the course, five MDM students—Daniel Nascimento, Marina Lúcio, Guilherme Cunha, Alex Boyd and Victor Li—built Duralde’s Mission, a virtual reality experience that simulates a spaceship mission.
Here, the team talks about the lessons they learned while building the VR experience.
Duralde’s Mission is a virtual reality experience in which the player takes the role of an astronaut with an important mission: testing the first ship capable of jumping through hyperspace. An operator helps players complete the mission by walking them through procedures and troubleshooting problems as they arise.
Our goal was to explore immersion and communication in a way unique to VR as a medium.
We wanted to contrast the isolating experience of being alone in space with the social element of having someone in mission control supporting the player over their headset. Many people see virtual reality and gaming as isolating, anti-social activities. We believe the most meaningful virtual reality experiences will be those with a social element, so we wanted to put the focus on this aspect.
3 Things We Learned Building a Virtual Reality Experience
1. Situate the experience in an unknown place.
Taking players out of a known and familiar place allowed us to focus on designing an environment optimal for VR. We designed our spaceship’s controls, audio, and other sensory aspects of the experience from the ground up for this medium, rather than trying to mimic something from real life.
2. Use core VR theories to guide your design—namely proto-presence, sensory immersion and social presence.
Developing for VR is very different from developing for a traditional screen. Rather than relying on well-known best practices, we had to build and user-test constantly in an agile fashion. However, a few core theories about what others have found compelling in VR helped guide our design iterations: proto-presence, sensory immersion and social presence.
Proto-presence is achieved through a sense of embodiment, a clear distinction between the body and the world; in particular, it arises when a VR experience gives players a consistent sense of a virtual body. We added a virtual hand and joystick that mirror the motions players make with the physical joystick they use for control. We also used the rotational and positional tracking the Oculus headset provides to animate the player’s virtual body and head. This lets the body and its shadow mirror the subtle movements players naturally make, which would otherwise be lost when interacting as a virtual character.
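As an illustration of what this mirroring amounts to, here is a minimal Python sketch (not the team's actual code, which most likely lived in a game engine): the tracked head pose is copied onto a hypothetical avatar every frame, so natural movements carry over to the virtual body. All names here are made up for illustration.

```python
from dataclasses import dataclass

# Hypothetical types for illustration; a real project would use the
# engine's transform and tracking APIs instead.
@dataclass
class HeadPose:
    yaw: float       # degrees, from rotational tracking
    pitch: float     # degrees, from rotational tracking
    position: tuple  # (x, y, z) metres, from positional tracking

@dataclass
class Avatar:
    head_yaw: float = 0.0
    head_pitch: float = 0.0
    head_position: tuple = (0.0, 0.0, 0.0)

def apply_tracking(avatar: Avatar, pose: HeadPose) -> None:
    """Called once per frame: mirror the tracked pose onto the avatar's
    head so the body and its shadow pick up every subtle movement."""
    avatar.head_yaw = pose.yaw
    avatar.head_pitch = pose.pitch
    avatar.head_position = pose.position
```

The important design point is that the avatar never animates independently of the player; every frame it simply restates what the tracking reports.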
Solid tracking also meant spending time heavily optimizing our application to hold a steady 75 frames per second. This is vital not only for smooth tracking, but also for avoiding the motion sickness and strain that poorly built virtual reality experiences can cause.
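To make the 75 fps target concrete: at 75 Hz each frame has roughly a 13.3 ms budget, and every subsystem has to fit inside it. A back-of-the-envelope sketch (the per-system costs below are made-up placeholders, not our actual profiler numbers):

```python
TARGET_FPS = 75                        # the headset display's refresh rate
frame_budget_ms = 1000.0 / TARGET_FPS  # ~13.3 ms available per frame

# Hypothetical per-frame costs in milliseconds, as a profiler might report:
costs = {"rendering": 7.5, "physics": 2.0, "audio": 1.0, "game logic": 1.5}
total_ms = sum(costs.values())

# If the total creeps over budget, frames drop and tracking judders.
assert total_ms <= frame_budget_ms
print(f"budget: {frame_budget_ms:.1f} ms, used: {total_ms:.1f} ms")
```

Thinking in milliseconds per frame, rather than frames per second, is what makes optimization decisions tractable: every asset or effect has a measurable cost against a fixed budget.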
For sensory immersion, we scoped out a small environment, which let us spend the bulk of our energy producing a few high-quality assets rather than worrying about breadth of content. Audio is an often-overlooked but important aspect of sensory immersion: our experience featured 3D spatial audio to give players a better sense of their surroundings. Finally, we used a subwoofer in our physical rig to mimic the sound and rumble of the ship as it made hyperspace jumps.
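The intuition behind spatial audio can be shown with a toy stereo model (a sketch only; real spatializers use HRTFs, and engine audio systems handle this for you): sounds get quieter with distance and pan toward the nearer ear, which is what lets players locate them in the environment.

```python
import math

def spatialize(source, listener, forward_deg):
    """Return (left_gain, right_gain) for a mono source.

    Positions are (x, z) metres on the horizontal plane; forward_deg is
    the listener's facing direction in degrees (0 = along +z).
    Simplified illustration, not production audio code.
    """
    dx, dz = source[0] - listener[0], source[1] - listener[1]
    dist = math.hypot(dx, dz)
    atten = 1.0 / (1.0 + dist)           # simple inverse-distance falloff
    angle = math.degrees(math.atan2(dx, dz)) - forward_deg
    pan = math.sin(math.radians(angle))  # -1 = full left, +1 = full right
    left = atten * math.sqrt((1.0 - pan) / 2.0)   # equal-power panning
    right = atten * math.sqrt((1.0 + pan) / 2.0)
    return left, right
```

A sound directly ahead reaches both channels equally; one off to the right comes through mostly in the right channel, so players can turn toward it without looking.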
The choice to have a live operator allowed us to focus on other aspects of immersion, such as the social, challenge, and imaginative elements. Through the role of the operator we were able to adapt to each player and co-construct a shared narrative.
3. Iterate, iterate, iterate.
In our first playtest session, players needed to point their head toward a button to select it, but we found that people had trouble using the interface. In a later iteration we added a red dot that follows the direction the player is facing, and the interaction became much smoother.
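A simplified version of the underlying gaze test (a sketch with hypothetical names; an engine's built-in raycasting would normally do this): a button counts as "looked at" when the angle between the head's forward vector and the direction to the button falls inside a small cone. The red dot is simply that forward vector drawn as a reticle, giving players feedback about where the gaze ray actually points.

```python
import math

def is_gazing_at(head_pos, gaze_dir, target_pos, cone_deg=5.0):
    """True if target_pos lies within cone_deg of the gaze direction.

    gaze_dir is a unit 3-vector; head_pos and target_pos are 3-vectors.
    """
    to_target = [t - h for t, h in zip(target_pos, head_pos)]
    norm = math.sqrt(sum(c * c for c in to_target))
    if norm == 0:
        return False  # target at the head itself; nothing to select
    # Cosine of the angle between gaze and the direction to the target.
    cos_angle = sum(g * c / norm for g, c in zip(gaze_dir, to_target))
    return cos_angle >= math.cos(math.radians(cone_deg))
```

Without the reticle, players had to guess where this cone pointed; drawing the dot made the selection mechanic legible at no extra interaction cost.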
From that point we built a skeleton of the experience, with only basic shapes and no textures. That let us evolve the experience in cycles of research, implementation and testing. By iterating we quickly found and solved problems, settling on the script and moving to a more realistic art pass only once we were confident in the prototype.
Duralde’s Mission was featured at VIFF Industry’s VR Rises event this fall and received positive feedback from all the participants.
It was also showcased at EA during the DigiBC Summer BBQ, and at an event this weekend with hundreds of attendees, including visiting members of the British Consulate.
See the students’ final project: