Rock(et)Ship Design Rationale

Personal Information

Name: Bart Oude Elferink
Contact details: Phone 06 13908146, Email Bartboris@hotmail.com
HvA student number: 500708575
Course: Communication & Multimedia Design
Educational institution: Hogeschool van Amsterdam
Start date: 16-4-2018
End date: 5-11-2018
Graduation supervisor: Roey Tsemah
Company: Storytrooper

The Product Biography can be viewed here:

Rock(et)Ship Product Biography

A live version of this website with embedded videos (in case this document is viewed offline) is available at:

www.unusualrenders.com/rocketship-design-rationale/


 

1. Introduction

My client Storytrooper is a storytelling company owned by Christina Mercken that organizes immersive workshops for anybody who is interested in a good story. She asked me to explore new storytelling experiences for her company using Virtual Reality (VR).

The product I worked on is a location-based multiplayer VR act that can travel across events and includes wirelessly interconnected Android smartphones placed in Gear VR headsets. It involves a light-hearted mini story that can be enjoyed by all ages. The final iteration of my project can be seen as a first draft or starting point for a story-driven group experience that can be fleshed out once the hardware catches up in the coming year(s).

This project is hands-on research into cooperation, social elements and the current possibilities of local VR multiplayer gameplay. It tackles the challenges and restrictions that come with developing for mobile VR and synchronizing game states over a wireless network.

3D render of the experience with three players logged in


2. Context

As technology advances and our attention spans tend to get shorter, Storytrooper wants to create new storytelling experiences that intrigue participants and keep them engaged for longer timeframes – thus creating a stronger storytelling experience using technology, sound and visuals.

Enriching her act with new experimental technology also offers opportunities for expanding her audience and revenue. She noticed upcoming Virtual Reality initiatives like the VR Arcade and Virtuality, where groups of people can come together to fully immerse themselves in a virtual environment. However, most of these places are either focused on experiencing the newest AR and VR hardware (like the recently opened Virtuality) or only offer a selection of experiences from existing digital stores (e.g. VR Gamehouse). There is one exception to this trend: the VR Arcade in Amsterdam. They offer their visitors a tailor-made multiplayer VR experience involving wireless free-roaming and shooting at zombies/aliens. Also, all of these places are bound to a set location and boast pretty hefty price tags: most tickets start at €25-30 for half an hour (Unbound VR, 2018).

The third-party software that's available doesn't fit the repertoire of Storytrooper (the experiences are fixed and cannot be altered to tell one of her own stories), so she needed a tailor-made application and somebody to help her create one. This is where this project came to life; together with Christina and Sammie, I wanted to explore the possibilities that story-driven VR can offer for groups. Their goal was to have a proof of concept that could excite potential investors in the coming year.

 

Christina speaking at Mezrab – Photos courtesy of Mezrab.nl


Christina telling one of her stories at the weekly storytelling night
(Video footage by Martin van Houwelingen)


3. User Description

Although Storytrooper’s audiences vary greatly, I’ve focused my research for this project on early adopters of VR. These are young (20-30) and open-minded people who love storytelling and are curious about new technologies.

Early VR adopters are people who are already interested in or familiar with VR. They are eager to try innovations and tend to accept them more readily than the average consumer. The product I'm working on is still a proof of concept and is therefore positioned at the very start of the adoption lifecycle. Targeting this specific group has the benefit that participants often already have experience with VR, so they should be better able to look past the flaws that come with a proof of concept (since they are aware of the newest innovations by full-grown companies and know what a fully developed product can be like).

Position of my product on the Technology Adoption Lifecycle model – image courtesy of Wikimedia

4. Concept Development

Design Challenge

My design challenge for this project is as follows:

‘How can a virtual reality experience

create immersive storytelling sessions for groups,

keeping them intrigued and engaged* ?’

*Within this project's context, by intrigued and engaged I mean the player's interest in moving along with the story and his urge to explore how the story develops.

Main research questions

– How can cooperation be stimulated within a multiplayer VR environment?
– What gameplay, narrative, and social elements cause players to stay engaged?

Sub-question

What forms of interaction can be utilized to make the gameplay feel intuitive
and immediately understandable?

Product Vision

Using virtual reality, we create gamified experiences for groups on location, allowing them to solve fun and challenging quests in a unique and immersive way.

Product Description

To meet the requirements of my product vision and to solve the design challenge, I came up with the concept of a location-based multiplayer VR-act that can travel across events and involves large decor pieces and an actor to supervise the act. The players would experience a story together and must rely on each other to successfully complete the experience.

After defining the product vision, the main challenges the project faced were:

  • What hardware and software would be most fitting for the product?
  • What would the story be and how many scenes would there be?
  • How would people interact with certain elements?

Hardware

Because Christina often works at remote locations, it's important for her VR act to be mobile enough to take with her. It also shouldn't be too much of a hassle to set up at the location where she'll be performing.
I chose the combination of a Gear VR and a Samsung Galaxy S8 because this setup is mobile and the phones can be quickly swapped out when the battery runs low or the system crashes. My full considerations on this choice can be read in Appendix 2: Hardware research of my product biography.

Samsung Gear VR and Galaxy S8 phone

Software

To develop the VR application I used Unity 3D, a game development platform for Windows and macOS. It can be used to develop many different applications for a wide variety of devices. It can even be used to create non-gaming apps that would normally be built in programs requiring more hands-on coding (Celada, Jan. 2015).

Appendix 3: Developing for Gear VR of the product biography contains more information about Unity, how I used a cloud-based solution for wireless multiplayer, and how to set up the Android development environment.
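The cloud-based multiplayer layer is built on Photon Unity Networking (PUN), as mentioned in the conclusion. To give an idea of what that boils down to inside Unity, here is a minimal connection sketch – assuming the PUN 2 API and hypothetical names, not the project's actual code – that connects each phone to the Photon cloud and drops it into a shared room:

```csharp
using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

// Minimal sketch (PUN 2 naming assumed; the 2018-era "classic" PUN API differs slightly):
// connect to the Photon cloud and put every headset into one shared room.
public class LobbyConnector : MonoBehaviourPunCallbacks
{
    void Start()
    {
        // Uses the App ID configured in the PhotonServerSettings asset.
        PhotonNetwork.ConnectUsingSettings();
    }

    public override void OnConnectedToMaster()
    {
        // Join any open room, or create one if none exists yet.
        PhotonNetwork.JoinRandomRoom();
    }

    public override void OnJoinRandomFailed(short returnCode, string message)
    {
        // Hypothetical room size; the demo ran with 2 players but could scale up.
        PhotonNetwork.CreateRoom(null, new RoomOptions { MaxPlayers = 4 });
    }

    public override void OnJoinedRoom()
    {
        Debug.Log("Joined room with " + PhotonNetwork.CurrentRoom.PlayerCount + " player(s).");
    }
}
```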

Scenario & Script

Close your eyes for a moment. Now imagine walking around at a festival with your friends when suddenly you all get involved in a mission to space. Drifting among the stars, you'll soon feel like you're part of something bigger than yourself, and you must rely on your friends to succeed in this wacky quest…

 

For the story, we first had a brainstorm with the whole team and, after some iterations, the concept and story became as follows:

You and some friends are walking around at a festival when suddenly you discover a giant spaceship made out of cardboard, sheets etc. Next to the ship, there is an actor dressed as a child playing spaceship. He asks you to join him on an adventure to space and shows you around the inside of the spaceship. After putting on a cardboard helmet with a pair of VR goggles inside it, you transform into a virtual crew member. From here, you have to work together with the rest of the players to operate the spaceship. After a while, the ship crashes on an alien planet and the players meet a group of aliens who turn out to react to music. By combining scrap from the crashed spaceship, they can either build instruments or weapons. If the players decide to play along with the aliens, this results in a massive rave and the aliens transport the players back to Earth. If they decide to attack the aliens, they will eventually be crushed by one of the aliens' giant friends.

The full storyboard, script and story arc can be found in Appendix 4: Script and Storyboard of the product biography.

The decor of the experience

Part of the storyboard

Methods

Playtesting / Speaking out Loud

The method I used the most is playtesting the experience with test subjects and having them speak their observations out loud. I recorded these playtests on video and analysed the behaviour of the players, so any possible issues or uncertainties could be resolved in the next iteration of the demo. I have included a detailed log of these tests in Appendix 6: Prototyping log of the product biography.

The Octalysis framework by Yu-Kai Chou

To find out how to make the experience as much fun as possible, I looked into (video)game design as the experience is essentially set up like a multiplayer VR game.

One of the most popular resources at the moment for learning about the fundamentals of gamification and behavioral design is Yu-Kai Chou's Gamification & Behavioral Design. His Octalysis framework lays out a structure for analyzing the driving forces behind users' motivation to keep using your product or service. According to his model, gamification is a design practice that mainly concerns the process of human motivation.

I implemented the Empowerment (of Creativity & Feedback) and Unpredictability (& Curiosity) Core Drives in my product. Details on what they stand for can be found in Appendix 7: the Octalysis Framework of my product biography.

 

A fully filled-in Octalysis framework by Yu-Kai Chou. Image courtesy of yukaichou.com

Task Analysis

One of the tools I used during the UX design is Task Analysis. A Task Analysis consists of a diagram that shows the actions taken by users to achieve their goals. The main goal of laying out these steps is to find out the nuances, motivations and reasons behind each action taken. This way, any unclear or unnecessary steps can be removed from the final design (Interaction Design Foundation, 2018). I performed a task analysis for every user test in Appendix 6: Prototyping log of the product biography.

 

 

Example of a Task Analysis: texting a hospital system – Author/Copyright holder: Andreas Komninos, The Interaction Design Foundation. Copyright terms and licence: CC BY-SA 3.0

The Lean Startup method

The Lean Startup is a method for developing a new product through continuous innovation using a build-measure-learn feedback loop. I would make an assumption or have an insight, sketch or propose a fitting solution, decide on what and how to implement, build it and try it out with one or more test subjects. I went through these sprints over and over, each time improving my product bit by bit. This method is very valuable when you're working on a product where you don't know in advance what will come up, which was certainly the case with my project.

 

A diagram of the Lean Startup method. Image courtesy of theleanstartup.com

 


 

Interaction tests

Because the Gear VR doesn’t support motion tracking of the hands (which was included in the first concept), I was limited to using gaze input and the included Gear VR controller. Therefore I decided to first explore some of the possibilities gaze input offers. This was important because I first needed to know which interaction methods I would want to implement in the final product.

After playing with all the different methods of interaction, I had a pretty good overview of what was possible for the final product and which elements I thought could work out best. Ultimately, I decided to use curiosity and playfulness as key components to get users immersed in the situation and encourage them to behave as I intended with my design. I went along with gaze input alongside controller input for a while but eventually dropped the former completely during design sprint 3, because combining them felt clumsy at some points and using a motion controller felt more intuitive. Furthermore, demanding two different types of input could even be confusing for inexperienced users.
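To make this concrete: the gaze-input pattern I experimented with roughly boils down to casting a ray from the headset's camera and activating whatever the player keeps looking at for a short dwell time. The snippet below is an illustrative sketch with assumed names and timings, not the project's actual code:

```csharp
using UnityEngine;

// Illustrative gaze-input sketch: a ray from the VR camera plus a dwell timer
// that "clicks" a target after the player has looked at it long enough.
public class GazeSelector : MonoBehaviour
{
    public float dwellTime = 1.5f;   // assumed: seconds the player must keep looking
    float gazeTimer;
    GazeTarget currentTarget;

    void Update()
    {
        // The script is assumed to sit on the VR camera, so forward == gaze direction.
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit, 10f))
        {
            GazeTarget target = hit.collider.GetComponent<GazeTarget>();
            if (target != null)
            {
                if (target != currentTarget) { currentTarget = target; gazeTimer = 0f; }

                gazeTimer += Time.deltaTime;
                if (gazeTimer >= dwellTime)
                {
                    currentTarget.Activate();   // e.g. log the player in at the lobby
                    gazeTimer = 0f;
                }
                return;
            }
        }
        currentTarget = null;
        gazeTimer = 0f;
    }
}

// Hypothetical component placed on anything that can be selected by gaze.
public class GazeTarget : MonoBehaviour
{
    public void Activate() { Debug.Log(name + " activated by gaze"); }
}
```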

Full details on these tests can be found in Appendix 5: Interaction Test environments of the product biography.

Sketch of all gaze interactions that I could think of

Prototyping different types of interaction

 

 

5. Design Sprints / Minimum Viable Product (MVP)

During development I went with a 2-player set-up because I had two headsets available and it allowed me to test faster. The product could potentially be scaled up for groups of 16 players.

MVP 0.1 - Proof of Concept

Main scene in sprint 1

Summary

The first prototype was a setup with two players in different parts of a spaceship. The player in the cockpit has to communicate the colors on his dashboard to the player in the back. When the player in the back presses the right buttons, the spaceship's engine starts. I took this game mechanic from the games Keep Talking and Nobody Explodes and Spaceteam. These cooperative games revolve around each player having different visual cues that can only be seen by that particular player. He then has to communicate them, and the other player(s) need to respond by performing actions that are only available within their own play space. This prototype was intended as a first proof of concept of how teamwork in VR could work.
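Keeping this mechanic consistent across headsets means broadcasting every button press to all connected phones so they agree on the engine state. A rough sketch of how that could look with Photon Unity Networking (PUN 2 naming and hypothetical field names assumed):

```csharp
using Photon.Pun;
using UnityEngine;

// Sketch of syncing the button/engine state across headsets.
// Requires a PhotonView on the same GameObject.
public class EngineConsole : MonoBehaviourPun
{
    // The colour code shown on the cockpit dashboard (set elsewhere; hypothetical).
    public int[] requiredSequence;
    int progress;

    // Called locally when the controller hits a physical button in the scene.
    public void PressButton(int buttonIndex)
    {
        photonView.RPC(nameof(OnButtonPressed), RpcTarget.All, buttonIndex);
    }

    [PunRPC]
    void OnButtonPressed(int buttonIndex)
    {
        // Runs on every connected phone, so the game state stays identical.
        if (buttonIndex == requiredSequence[progress])
        {
            progress++;
            if (progress >= requiredSequence.Length) StartEngine();
        }
        else
        {
            progress = 0;   // wrong button: start the sequence over
        }
    }

    void StartEngine() { Debug.Log("Engine started!"); }
}
```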

Testing sprint 1

Tests and results can be viewed within 4. Design Sprints and Appendix 6: Prototyping Log of the Product Biography


MVP 0.2 - Setting the scene

Main scene in sprint 2

Summary

This second prototype included the first story elements and had players interact more intuitively with their environment. I found that allowing players to interact with their environment in a direct way (mashing buttons by moving the controller) and rewarding them with amusing feedback for doing so caused them to stay more engaged. Players were now seated next to each other so that they could see the other player's head move around. This added a social element which also kept them engaged.

Changes in this version:

Exit gaze input

I decided to focus solely on controller input since it's a more direct and intuitive way of interacting with elements in the immediate environment. At this point, only logging in at the lobby is still performed by gaze input.

Position of the players

The players were now seated in a circle around a dashboard, instead of one player in the front of the ship and the rest of the crew in the back. This strengthens social presence because fellow players' movements are easier to spot.

New gameplay elements

I added a sequence for getting the ship up in the air by creating music together. The buttons in front of the player can be used to create a rhythm with odd sounds. This sequence is meant to make players aware of how the buttons work, triggering their curiosity and playfulness while doing so.
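At its core this is a straightforward piece of Unity behaviour: a button that plays a quirky sound whenever the player's virtual hand collides with it. A minimal sketch (with assumed tags and names, not the project's actual code) could look like this:

```csharp
using UnityEngine;

// Minimal sketch of a noise-making dashboard button: when the player's virtual
// hand enters the button's trigger collider, play one of a few odd sounds,
// rewarding the player for simply fooling around.
[RequireComponent(typeof(AudioSource))]
public class RhythmButton : MonoBehaviour
{
    public AudioClip[] oddSounds;      // quirky samples assigned in the inspector
    AudioSource source;

    void Awake() { source = GetComponent<AudioSource>(); }

    void OnTriggerEnter(Collider other)
    {
        // "Hand" is a hypothetical tag on the controller-driven hand model.
        if (!other.CompareTag("Hand") || oddSounds.Length == 0) return;

        source.PlayOneShot(oddSounds[Random.Range(0, oddSounds.Length)]);
        // Visual feedback (e.g. a quick press animation) would be triggered here too.
    }
}
```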

Testing sprint 2

Tests and results can be viewed within 4. Design Sprints and Appendix 6: Prototyping Log of the Product Biography


MVP 0.3 - Introducing multiplayer mechanics

The final act in sprint 3 – visual instructions are shown on the screens, accompanied by spoken instructions from the narrator

Summary

For the final act of the demo (working together to avoid getting the spaceship hit) I expanded upon my mechanic from design sprint 1. In this scene, players see buttons appear on their own display that they have to call out to the other players who are within reach of that particular button. If the button is pushed in time, the ship avoids colliding with objects; if not, the objects crash into the ship. Players can't die or get a 'Game Over' while doing this, so groups of players won't get stuck during the experience, but they'll still receive feedback from the narrator. After a while, the game moves on to the next scene automatically.
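The timing loop behind this act can be sketched as follows: one phone (the Photon master client) picks which button must be pressed, broadcasts the prompt, and resolves the round after a fixed response window; a miss only triggers crash feedback, never a game over. Names, round counts and timings below are assumptions, not the project's actual code:

```csharp
using System.Collections;
using Photon.Pun;
using UnityEngine;

// Sketch of the timed-prompt loop for the avoidance act (PUN 2 naming assumed).
public class AvoidanceSequence : MonoBehaviourPun
{
    public int buttonCount = 6;        // total buttons spread over all players
    public float responseWindow = 4f;  // seconds the crew has to react
    bool buttonWasPressed;

    IEnumerator Start()
    {
        if (!PhotonNetwork.IsMasterClient) yield break;   // one phone drives the sequence

        for (int round = 0; round < 10; round++)          // fixed length, then move on
        {
            int target = Random.Range(0, buttonCount);
            photonView.RPC(nameof(ShowPrompt), RpcTarget.All, target);

            buttonWasPressed = false;
            yield return new WaitForSeconds(responseWindow);

            // Hit: the ship dodges. Miss: debris hits the hull and the narrator comments.
            photonView.RPC(nameof(ResolveRound), RpcTarget.All, buttonWasPressed);
        }
    }

    [PunRPC] void ShowPrompt(int target) { /* display the icon on the right player's screen */ }
    [PunRPC] void ResolveRound(bool hit) { /* play dodge or crash feedback */ }

    // A button script would call this via photonView.RPC(..., RpcTarget.MasterClient)
    // when the prompted button is physically hit on any of the phones.
    [PunRPC] public void RegisterPress() { buttonWasPressed = true; }
}
```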

Changes in this version:

Log-in room overhaul

I removed the voice-over and gaze input from the log-in room and replaced it with visual instructions and controller input. This way the log-in sequence can be timed by the player himself. There is background music playing to give feedback about the volume of the headset so players can adjust the audio level before any important voice-over starts playing.

Old way (left) vs. new way (right) of logging in

'Selfie display' – feedback about the player's appearance

Because players were unaware of their own avatar’s appearance, I added player-cams to the dashboards of the players in the main scene. This display shows live footage of the player moving during the experience. This way the player gets clear feedback about what he looks like in the game.

Old way of giving feedback on what the player looks like (left) VS new way (right)
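Technically, such a player-cam is just a second camera rendering into a RenderTexture that is mapped onto a small screen on the dashboard. A minimal sketch with assumed names (not the project's actual code):

```csharp
using UnityEngine;

// Sketch of the 'selfie display': a second camera films the local player's avatar
// and writes into a RenderTexture, which a quad on the dashboard shows as a monitor.
public class SelfieDisplay : MonoBehaviour
{
    public Camera selfieCamera;        // pointed at the local player's avatar
    public Renderer dashboardScreen;   // quad on the dashboard that acts as the monitor

    void Start()
    {
        var feed = new RenderTexture(256, 256, 16);   // low-res is fine for a small screen
        selfieCamera.targetTexture = feed;            // camera now renders into the texture
        dashboardScreen.material.mainTexture = feed;  // screen material displays the feed
    }
}
```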

 

Testing sprint 3

Tests and results can be viewed within 4. Design Sprints and Appendix 6: Prototyping Log of the Product Biography


MVP 0.4 - Rethinking multiplayer and onboarding elements

 

Summary

The final act felt chaotic because players assumed the buttons shown on their screens corresponded to the buttons on their own panel, as the two looked too similar. Also, the how-and-why of suddenly having an avatar at the start of the experience remained unclear up to this sprint.
I decided to implement new mechanics to stimulate teamwork within the environment and to transform players into members of the space crew. This MVP did a much better job of making players work together and feel more connected to the game world overall.

Changes in this version:
Voice chat!

Until now, the only way to communicate with other players was to take off the headphones or lower their volume, which would break the immersion.
So I added voice chat to the experience: players can now hear each other talk without having to remove the earphones. This should increase the social presence and the overall ease of communicating with other players.

New login instructions

While playing Fail! Factory on my Oculus Go I stumbled upon an even better way to provide instructions on recentering the controller.

Fail Factory’s instructions VS my instructions using this format
New multiplayer mechanics for the final act

In his presentation at the Game Developers Conference 2015, Alistair Aitcheson (an independent developer of silly and chaotic room-scale multiplayer games) put it as follows:

My job as a designer is not to create elegant systems.

It is to engineer interesting social situations.

Alistair Aitcheson with some of his quirky inventions. Images courtesy of Alistair Aitcheson.

This point of view was very interesting to me personally and for my development process, as I intend to prioritize fun social interaction over a perfectly streamlined VR game. My concept lends itself to being a lot more 'outside the box', and this led me to implement new mechanics in the final act to stimulate teamwork and a feeling of social presence.

Concept : Mash THE button!

During the Ubicomp course in 2016, I came up with the idea of a physically active game that involves running between colored blocks and hitting the corresponding block once a lamp on top of a pole changes to that color. I took the idea from the Pokémon Stadium 2 minigame 'Pichu's Power Plant' and the Guitar Hero game franchise. This mechanic is also found in the Test Your Strength games often seen at fun fairs. I thought this was a great idea to re-use because we had a lot of fun playing around with the lo-fi prototype back then, and it complements the enthusiasm players showed for the buttons during the first three sprints.

In the concept, a progress meter runs along the entire vertical axis of the lamp post. Players need to mash the button matching the lamp's color, but the lamp keeps changing color over time, and mashing the wrong button or not hitting anything at all causes the meter to slowly decrease.

Prototype of this concept built with Arduino microcontrollers (2016)
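The underlying game logic is simple; the sketch below (with assumed values and names, not the project's actual code) shows one way to express the color cycling, the meter drain and the reward for correct mashes in Unity:

```csharp
using UnityEngine;

// Sketch of the 'Mash THE button' meter logic: the lamp cycles through colours,
// correct mashes push the progress meter up, wrong mashes and idling drain it.
public class MashTheButton : MonoBehaviour
{
    public Color[] lampColors;          // possible lamp colours
    public float colorInterval = 3f;    // assumed: seconds before the lamp changes colour
    public float drainPerSecond = 0.1f; // how fast the meter drops when nothing happens
    public float gainPerHit = 0.05f;    // progress gained per correct mash

    int currentColor;
    float meter;                        // 0..1, drives the fill shown on the lamp post
    float colorTimer;

    void Update()
    {
        // The lamp keeps changing colour over time.
        colorTimer += Time.deltaTime;
        if (colorTimer >= colorInterval)
        {
            colorTimer = 0f;
            currentColor = Random.Range(0, lampColors.Length);
        }

        // Not hitting anything lets the meter slowly empty.
        meter = Mathf.Clamp01(meter - drainPerSecond * Time.deltaTime);
    }

    // Called by a button when the player's hand hits it.
    public void OnButtonMashed(int colorIndex)
    {
        if (colorIndex == currentColor)
            meter = Mathf.Clamp01(meter + gainPerHit);      // correct colour: progress
        else
            meter = Mathf.Clamp01(meter - gainPerHit);      // wrong button: knock it back
    }
}
```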

New way of receiving your avatar

Because the how-and-why of suddenly having an avatar at the start of the experience remained unclear up to this sprint, I thought of a new mechanic for transforming players into members of the space crew (as described in Appendix 4: script and storyboard of the product biography).

Concept : The avatar toy dispenser

Personally, I’m quite fond of toy dispensers where you put in a coin and a plastic ball with a random toy rolls out of the machine, and of opening a Kinder Surprise egg. I thought this concept of receiving a different avatar every time you enter the experience would be a great fit for the onboarding process. It again taps into the player’s Unpredictability & Curiosity core drive, keeping him coming back to see which avatar he’ll receive this playthrough. I decided to use this concept in my product since the toy-like assets fit it really well and the randomness factor should theoretically give the replayability of the final product a boost.
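Implementation-wise, the dispenser boils down to rolling a random avatar index for the local player and broadcasting that roll so every headset shows the same toy. A rough sketch (PUN 2 naming and hypothetical names assumed, not the project's actual code):

```csharp
using Photon.Pun;
using UnityEngine;

// Sketch of the toy-dispenser idea: each player rolls a random avatar when they
// join, and the result is broadcast so everyone sees the same toy pop out.
// Requires a PhotonView on the same GameObject.
public class AvatarDispenser : MonoBehaviourPun
{
    public GameObject[] avatarPrefabs;   // the toy-like crew member models

    void Start()
    {
        if (!photonView.IsMine) return;                      // only roll for the local player

        int roll = Random.Range(0, avatarPrefabs.Length);    // the 'surprise egg' moment
        photonView.RPC(nameof(ApplyAvatar), RpcTarget.AllBuffered, roll);
    }

    [PunRPC]
    void ApplyAvatar(int roll)
    {
        // Buffered RPC so players who join later still see the right avatar.
        Instantiate(avatarPrefabs[roll], transform.position, transform.rotation, transform);
    }
}
```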

Testing sprint 4

Tests and results can be viewed within 4. Design Sprints and Appendix 6: Prototyping Log of the Product Biography

MVP 0.5 - The final demo / last refinements

Summary

This demo is the final version for the scope of this graduation project. It includes the last refinements and feedback from playtests 5, 6 and 7.

Adjustments in the last 2 project weeks:

  • The lamp colors change at a slower interval.
  • The lamp is a bit smaller so it can be viewed from better angles.
  • The buttons are spaced further apart and the hitboxes of the hands are smaller.
  • Narrative was added to the start.
  • Some panels were added at the start to better sync player states.

Testing sprint 5

Tests and results can be viewed within 4. Design Sprints and Appendix 6: Prototyping Log of the Product Biography


This is the final iteration of the demo, as seen by Player 1:

This is a video of the experience with 2 players logged in to showcase the demo’s context:

6. Conclusion

Conclusion

During my research, I found that allowing players to play around with interactive elements in their environment created a social element, because players would talk to each other about it. Using direct input, by giving players a virtual hand to hit brightly colored buttons, felt intuitive to them and was immediately understandable. Rewarding them with amusing feedback for playing around with the interaction points caused them to stay more engaged.

The final MVP is a great starting point for a story-driven group experience to be fleshed out by Storytrooper once more advanced hardware is released. Christina was completely unfamiliar with VR at the start of this project and was pleased to see how her story translated to this platform when the project was finished. Although the final experience doesn't require very intensive cooperation, players mostly enjoyed having the freedom to try things out themselves and to remark on it to each other. The biggest value of the final demo turned out to lie not necessarily in the focus on teamwork, but rather in each individual having fun on their own while still experiencing a shared world. Comical narrative was also a key element in keeping players engaged and intrigued to find out what the next objective would be.

The biggest challenges I faced during development were the restrictions of the underdeveloped hardware and my inexperience with synchronizing online gameplay. I'm especially disappointed that the hand tracking within the Photon network didn't work out; being able to wave to each other really adds to the social part.

Had I started this project 1-1.5 years later with the experience I have now, I would have gone with the newly announced Oculus Quest headset. This would have allowed me to implement a lot more of the initial ideas because it lets players walk around using positional inside-out tracking. The better tracking of the hands would also mean players could actually pick up objects and move them around the spaceship, which would really have been a game changer.

 

Christina and Sammie trying out the demo

Appendix: Reflection

Process

My goal during this project was to create a multiplayer VR game and cope with all the challenges I would face along the way. It was interesting to conclude my final year and a half of experiments with game design and AR/VR in this way. Because I was allowed a lot of creative freedom, I worked hard from the very start and dived head-first into experimenting. The project quickly became something I worked on with passion, and I could have continued working on it for much longer than the scope of this graduation project. It didn't feel like a mandatory study assignment; it really was something of my own and kept me motivated as it slowly took its current shape. There were many difficulties along the way and sometimes I had to work around issues I couldn't figure out, but every step forward gave me a lot of energy to move on. I disliked the restrictions of the underdeveloped hardware and would have preferred using a more advanced platform, which would have allowed me to implement a lot more of the initial ideas. But again, this is just a matter of time, because more appropriate hardware has already been announced for spring 2019.

Product

The final product, although a bit rough around the edges, does a good job of showcasing a 'VR act' in my opinion. The overall silliness of the characters and situations puts a smile on many participants' faces, and they're often disappointed the story doesn't move past the crash sequence. I think it can be seen as a first draft or starting point for a story-driven group experience because it shows the potential of a light-hearted mini story that can be enjoyed by all ages.

Personal

I’ve learned a lot about the principles of multiplayer gameplay and the basics of Photon Unity Networking. I still haven’t mastered many of its aspects, but the project feels like a good start. My strength lies in converting a design problem into a concrete solution and being able to build Virtual and Augmented Reality prototypes that demonstrate the idea or workings behind the conceived concept. This way I can make concepts that use these techniques more tangible during presentations and pitches. After my graduation I’d like to continue experimenting with these kinds of demos in the field of Virtual and Augmented Reality alongside my job as a 2D/3D animator.
