
one by one (VR, 2019)

one by one : VR at the Global Game Jam


For the 2019 Global Game Jam I traveled up to Oculus HQ and worked on a team of seven to create one by one, a bittersweet VR arcade game targeting the Oculus Go. I designed the initial concept for the game, and led programming, Unity work, and project management. I also had an absolute blast.

The concept

You return to your childhood home, to your bedroom, and everything is seemingly exactly as you left it. Your bird is tweeting in its cage, your favorite stuffed animal sits against the wall, and even the clutter on your desk remains untouched. But you’re older. You’ve experienced new things. You’ve changed. How long can you keep your memories and attachments from degrading?

The mechanics

Having only two days to finish the game, we were immediately mindful of scope and looked for ways to ensure our vision would be achievable. Very quickly we settled on a method of interaction that fit our take on the theme and was simple enough to start testing immediately: we limited interaction to a gaze-based mechanic and disallowed locomotion entirely.

Our solution had many benefits:

  1. We engaged with the theme--holding onto the meaning of your treasured possessions.

  2. We ensured the player wouldn’t need much of a tutorial--no learning what the buttons do, no figuring out how to move from place to place, etc.

  3. Our central game mechanic was extremely lightweight. Because we knew how large the environment was, we were able to create our gaze-based solution using a trigger collider attached to the headset gameobject (sketched below, after this list). This let us spend our performance budget on feedback, polish, and an ambitious sound design.

  4. We ensured we could easily port this experience to even lower-end VR systems like the GearVR.
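To make the third point concrete, here is a minimal sketch of a trigger-collider gaze mechanic in Unity. It's illustrative rather than our actual jam code: GazeProbe and Degradable are hypothetical names, and the probe assumes a long, narrow trigger collider parented to the camera rig, so whatever it overlaps counts as "being looked at."

```csharp
using UnityEngine;

// Hypothetical sketch: a trigger collider parented to the headset stands
// in for a raycast. Anything it overlaps is treated as under the gaze.
public class GazeProbe : MonoBehaviour
{
    void OnTriggerStay(Collider other)
    {
        Degradable target = other.GetComponent<Degradable>();
        if (target != null)
            target.GazeHeal(Time.deltaTime);
    }
}

public class Degradable : MonoBehaviour
{
    [SerializeField] float decayPerSecond = 0.02f;
    [SerializeField] float healPerSecond = 0.1f;
    float integrity = 1f; // 1 = pristine, 0 = shattered

    public void GazeHeal(float dt)
    {
        integrity = Mathf.Min(1f, integrity + healPerSecond * dt);
    }

    void Update()
    {
        // Everything decays unless the player's gaze is healing it.
        integrity = Mathf.Max(0f, integrity - decayPerSecond * Time.deltaTime);
        // Swap in the next decay-stage mesh (and finally shatter) as
        // integrity crosses each of the five thresholds.
    }
}
```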

But this also created some interesting challenges.

 
[GIF: an interactable object flashing as it degrades, then healing under the player's gaze]

Design challenges

Our central design challenges were: How do we ensure players understand their gaze is what's affecting the world? How do we make looking feel material? How do we let players know they need to divert their attention elsewhere to stop another object from degrading? In short, how do we create that cause-and-effect connection and supply that information during gameplay?

The solutions were feedback, feedback, feedback, feedback. We tried many different approaches, and found some that got the job done:

  1. Each object goes through five stages of degradation, the last of which is shattering/destruction. Our artists, Terra Snover and Kevin Castaneda, created multiple versions of the interactive objects in various states of decay so we could give clear feedback to the player on their progress.

  2. Our title screen doubled as our tutorial. We placed a blooming flower in front of the player and cued them to look behind them. Gazing away from the flower pot triggered prompting text (“Out of sight, out of mind”) and led their gaze back to the flower, which was now wilted and dropping its last petal.

  3. By raycasting in randomized directions from the HMD gameobject and triggering particle systems, we created an ambient sparkle at the center of the player’s vision, showing their gaze literally impacting the scenery of the room. We created another particle system with clear iconography--medical crosses--to trigger when the player is looking at an interactable object, signifying that it is being healed (see the first sketch after this list).

  4. Sound design was at the center of our experience. Our composer, Jordan Chin, wrote a five-track song for the game, with each track emitted from one of the interactable objects. In this way we were able to take advantage of 3D sound to help players monitor the objects out of sight. As an object degrades, its track blends into, and is eventually replaced by, a glitched version of itself. Interactable objects also emit a sound when they reach a new level of degradation, plus foley sounds and spoken vignettes when gazed at. Stephane Lallee, who was in charge of implementing the sound in Unity, found FMOD especially helpful for these complex behaviors (a sketch of the parameter-driven blend follows this list).

  5. We animated a realtime, shadow-casting point light to move between actively degrading objects. Because objects in the room cast shadows based on where the light is, the player can visually echolocate where they need to turn their attention (sketched below).
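A minimal sketch of the gaze-sparkle idea from point 3, reusing the hypothetical Degradable component from the earlier sketch. The jitter cone and particle wiring are assumptions, not our exact jam implementation.

```csharp
using UnityEngine;

// Hypothetical sketch: each frame, cast a ray from the HMD in a slightly
// randomized forward direction and emit a particle where it hits, so the
// player's gaze visibly touches the room. Attach to the camera; assign
// the two particle systems in the inspector.
public class GazeSparkles : MonoBehaviour
{
    [SerializeField] ParticleSystem sparkle;     // ambient glint on scenery
    [SerializeField] ParticleSystem healCrosses; // medical-cross icons
    [SerializeField] float jitterDegrees = 3f;

    void Update()
    {
        // Jitter the gaze direction inside a small cone.
        Quaternion jitter = Quaternion.Euler(
            Random.Range(-jitterDegrees, jitterDegrees),
            Random.Range(-jitterDegrees, jitterDegrees), 0f);
        Vector3 dir = jitter * transform.forward;

        RaycastHit hit;
        if (Physics.Raycast(transform.position, dir, out hit))
        {
            // Show heal iconography on interactables, plain sparkle elsewhere.
            bool healing = hit.collider.GetComponent<Degradable>() != null;
            ParticleSystem ps = healing ? healCrosses : sparkle;
            ps.transform.position = hit.point;
            ps.Emit(1);
        }
    }
}
```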
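For point 4, a hedged sketch of how a degradation-driven blend might be wired up with FMOD's Unity integration. The event path and the “Degradation” parameter are hypothetical, and depending on the integration version the call is setParameterByName or the older setParameterValue.

```csharp
using UnityEngine;
using FMODUnity;
using FMOD.Studio;

// Hypothetical sketch: each interactable object owns one FMOD event whose
// "Degradation" parameter (0 = clean track, 1 = glitched) drives the blend.
public class DegradationAudio : MonoBehaviour
{
    [SerializeField] string musicEvent = "event:/Music/BirdcageTrack"; // illustrative path

    EventInstance instance;

    void Start()
    {
        instance = RuntimeManager.CreateInstance(musicEvent);
        // Position the emitter on the object so spatialized audio lets
        // players monitor it by ear when it's out of sight.
        instance.set3DAttributes(RuntimeUtils.To3DAttributes(transform));
        instance.start();
    }

    // Called by the degradation system; 0 = pristine, 1 = shattered.
    public void SetDegradation(float amount)
    {
        instance.setParameterByName("Degradation", amount); // older FMOD: setParameterValue
    }

    void OnDestroy()
    {
        instance.stop(STOP_MODE.ALLOWFADEOUT);
        instance.release();
    }
}
```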
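And a small sketch of the guiding light from point 5; the easing speed and vertical offset are illustrative.

```csharp
using UnityEngine;

// Hypothetical sketch: one realtime, shadow-casting point light eased
// toward whichever object is currently degrading, so its moving shadows
// cue the player where to look next.
public class GuidingLight : MonoBehaviour
{
    [SerializeField] float easeSpeed = 2f;
    public Transform target; // set by the degradation system

    void Update()
    {
        if (target != null)
            transform.position = Vector3.Lerp(
                transform.position,
                target.position + Vector3.up, // hover above the object
                easeSpeed * Time.deltaTime);
    }
}
```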

Technical challenges

Of course, we faced a game-jam-appropriate share of unwelcome surprises in getting the game playable. Happily, even with our very limited development time, we found a solution to each of these and finished with a fully playable game.

  1. Though the Go is more powerful than it looks, we still needed to do a lot of optimization to ensure a good framerate. We were careful about setting up our physics layers so we weren’t checking for more collisions than necessary. We also opted to disable mixed lighting in favor of 100% realtime lighting, trading a little ambiance for a solid performance boost. Careful management of texture sizes and mipmaps let us save detail for where it would be most noticeable to the player.

  2. Unity 2018.3 does not play nice with the Go, apparently. Because we identified this early on, we were able to recreate the project in Unity 2018.2 before we had spent much time creating prefabs, which would otherwise have had to be recreated in the new project.

  3. The Go we were testing on was supplied by Oculus, and had an Oculus-specific developer menu as well as compatibility issues with certain flavors of the Android SDK. With help from Dan Miller, a Unity evangelist helping out at the jam, we figured out that Android SDK 26, which I had used for previous Go projects, wouldn’t work. We moved our build target to SDK 25 (see the note after this list).
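At the jam we simply changed the target in Player Settings, but the same choice can be pinned in source with an editor script so the build can't silently drift back to an incompatible API level. This is a hypothetical sketch, and enum members vary by Unity version.

```csharp
using UnityEditor;

// Hypothetical editor-side guard: force the Android target SDK to API 25,
// the level that worked on our Oculus-supplied Go. Runs on editor load.
[InitializeOnLoad]
static class AndroidBuildGuard
{
    static AndroidBuildGuard()
    {
        PlayerSettings.Android.targetSdkVersion = AndroidSdkVersions.AndroidApiLevel25;
    }
}
```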

What we’d have done differently

Thankfully, our team communicated really well throughout the jam, and each member got to contribute to many areas of the design and development. Ashley Reed, our narrative designer, did an amazing job fleshing out a story from a pretty bare concept and recorded some great voice acting from members of our team and other helpful jammers. Tiana Le created gorgeous branding for the game that really captured its nostalgic spirit. Everyone helped out with many iterations of QA. The camaraderie kept our spirits high and let us quickly tackle problems as they arose. There are, however, some things we should have set up differently:

  1. We should have spent some time at the beginning of the jam training everyone up on a basic Git GUI. Though a few of us were comfortable using version control, there were some hitches with Git that might have been avoided, and we lost time transferring assets from one computer to another just to push them to the repo.

  2. We should have given the artists more specific technical guidelines. The models and textures the artists created were beautifully done and stylistically consistent, but a clear rubric for how the meshes should be exported would have helped limit our draw calls (in an already CPU-heavy game), and guidelines for material creation would have saved some work on normal maps that had to be removed for performance reasons.

  3. We should have had a plan B for presenting our game to the rest of the attendees. After such intense work, it was a real disappointment that the solution the jam provided to broadcast directly from the Go glitched out during our presentation, allowing us to show only a brief snippet of our game. If we had captured some video beforehand we could have focused more on our accomplishments than on “...so it still isn’t showing up...”

  4. I’d have brought something comfier than a yoga mat to sleep on.


Try it out!