Immersive Tech/XR Designer & Developer

City of Sand

Role: Design + Development Lead, Producer

An interactive narrative VR experience for the Meta Quest 2 built as part of the Oculus Launch Pad program.

City of Sand is inspired by the true story of Bayocean, Oregon, a turn-of-the-20th-century coastal resort town that was washed away into the Pacific after its developers built too much, too fast.

Vertical Slice Core Collaborators: Robert Quillen Camp, Christopher Lane Davis, Jonathan David Martin, and Gil Ramirez.

A speaker contraption next to a dimly lit desk with the subtitle: "Hello, can you hear me kiddo?"


As one of the creators of City of Sand, I designed and developed rich VR interactions and accessibility features, worked with VFX and SFX designers to integrate their work into the vertical slice, and collaborated on overall project vision and strategy.


Timelines for Complex Narrative Moments

In City of Sand, the player’s interactions with objects around them often trigger elaborate reactions in the environment – there’s magic lurking in this world! We used Unity’s Timeline system to integrate the many visual and sound effects in these transformative moments. Timelines also made it easier to sync work from multiple members of the team, such as sound design and animation.

As with my work on the Franklin’s Secret City AR app, I worked with the team to translate the narrative script into a spreadsheet of programmable actions, interactions, and triggers, which we then built into the project’s timelines. I also built a timeline manager and debugging UI that let the team skip through these timelines during development for quick testing.
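A simplified sketch of how a timeline manager with a debug “skip” control can be wired up in Unity; the class and field names here are illustrative rather than the project’s actual code:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Playables;

// Illustrative sketch: sequences the narrative timelines and lets
// developers jump ahead during testing.
public class TimelineDebugManager : MonoBehaviour
{
    [SerializeField] private List<PlayableDirector> narrativeTimelines;
    private int currentIndex;

    public void PlayCurrent()
    {
        narrativeTimelines[currentIndex].Play();
    }

    // Hooked to a debug UI button: stop the current timeline and
    // start the next one immediately.
    public void SkipToNext()
    {
        narrativeTimelines[currentIndex].Stop();
        currentIndex = Mathf.Min(currentIndex + 1, narrativeTimelines.Count - 1);
        narrativeTimelines[currentIndex].Play();
    }
}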

 

Accessibility: Subtitles for Virtual Reality

Because dialogue is such an integral part of this narrative experience, I started prototyping subtitles early in the process. My preference was for in-world subtitles instead of ones that stay stuck to the player’s camera view. But finding a fixed location in the environment for the subtitles to live proved difficult, since we wanted players to gaze freely around the room. I drew inspiration from research on other models for subtitles in VR, especially those in Virtual Virtual Reality by Tender Claws, and built my own “sticky” in-world subtitles. The subtitles have multiple stick points around the room and move to whichever one is in the player’s view. This allows the subtitles to stay visible at all times without feeling like they are stuck on the player’s face.
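A simplified sketch of the sticky-subtitle idea in Unity, assuming a set of predefined anchor transforms placed around the room; the names are hypothetical, and a production version would likely add smoothing or hysteresis so the panel doesn’t flicker between points:

using UnityEngine;

// Illustrative sketch: the subtitle panel jumps to whichever
// predefined stick point is most centered in the player's view.
public class StickySubtitles : MonoBehaviour
{
    [SerializeField] private Transform[] stickPoints;   // fixed anchors around the room
    [SerializeField] private Transform playerCamera;    // the HMD camera
    [SerializeField] private Transform subtitlePanel;   // world-space subtitle UI

    private void LateUpdate()
    {
        Transform best = stickPoints[0];
        float bestDot = -1f;

        // Pick the anchor most aligned with the player's gaze direction.
        foreach (Transform point in stickPoints)
        {
            Vector3 toPoint = (point.position - playerCamera.position).normalized;
            float dot = Vector3.Dot(playerCamera.forward, toPoint);
            if (dot > bestDot)
            {
                bestDot = dot;
                best = point;
            }
        }

        subtitlePanel.position = best.position;
        // Face the panel away from the player so the text reads correctly.
        subtitlePanel.rotation = Quaternion.LookRotation(subtitlePanel.position - playerCamera.position);
    }
}

Comparing the gaze direction against each anchor with a dot product is what keeps the text visible without pinning it to the camera: the panel only relocates between fixed points, it never follows the head directly.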

Since we were already using Unity timelines to play dialogue in the project, I incorporated custom playables that made it easy for the team to sync the subtitle text UI with the dialogue audio clips.
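A rough sketch of what a custom subtitle playable can look like with Unity’s Timeline API; SubtitleDisplay is a hypothetical UI helper, and a complete version would also define a track asset and bind the subtitle text component:

using UnityEngine;
using UnityEngine.Playables;

// Illustrative sketch: a Timeline clip asset that carries one line of dialogue.
[System.Serializable]
public class SubtitleClip : PlayableAsset
{
    [TextArea] public string subtitleText;

    public override Playable CreatePlayable(PlayableGraph graph, GameObject owner)
    {
        var playable = ScriptPlayable<SubtitleBehaviour>.Create(graph);
        playable.GetBehaviour().text = subtitleText;
        return playable;
    }
}

// Shows the line while its clip plays alongside the dialogue audio clip,
// and clears it when the clip ends.
public class SubtitleBehaviour : PlayableBehaviour
{
    public string text;

    public override void OnBehaviourPlay(Playable playable, FrameData info)
    {
        SubtitleDisplay.Show(text);   // hypothetical subtitle UI helper
    }

    public override void OnBehaviourPause(Playable playable, FrameData info)
    {
        SubtitleDisplay.Clear();
    }
}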

Design/Dev Process Snapshot: Camera Interaction

To transition out of the first flashback, the player is presented with a camera to take a photo of the scene. When they take the picture, the camera flash turns the scene black and white, and the player zooms out of the flashback environment, which flattens into a two-dimensional photo. I worked on the camera interaction that triggers this transition.

Because the flashback takes place in the early 20th century, before cameras had standard shutter buttons, I researched camera shutter mechanisms of the period. The project’s version of Bayocean is a fantasy rather than a strict historical recreation, so ultimately I was looking for an interaction that felt engaging and matched the style of our vintage camera model. We decided that the squeeze-bulb mechanism (which takes a picture by sending a puff of air through a tube to open the camera shutter) would be the most novel and fun interaction for this moment.

Initially, I built a mechanic in which the player must first grab the squeeze bulb with the usual Grip button on the Touch controller and then “squeeze” the bulb with the Trigger button. Playtesting revealed that players had trouble figuring out the two-step mechanic, and we received feedback that introducing the Trigger button for the first time at this moment in the experience was not intuitive.

For the vertical slice, we decided to simplify the interaction into a one-step process. Because several of our key interactions already involved simple grabs, I designed this one to be more of a yank: to trigger the photograph, the player must pull the bulb beyond a set radius around its resting position.
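A minimal sketch of that one-step yank check in Unity, assuming the grab system moves the bulb’s transform directly; the radius value and event hookup are illustrative:

using UnityEngine;
using UnityEngine.Events;

// Illustrative sketch: once the grabbed squeeze bulb is pulled past a
// radius around its resting position, the photo is triggered.
public class SqueezeBulbYank : MonoBehaviour
{
    [SerializeField] private float triggerRadius = 0.15f;  // meters
    [SerializeField] private UnityEvent onPhotoTaken;       // e.g. flash + flashback transition

    private Vector3 restingPosition;
    private bool hasFired;

    private void Start()
    {
        restingPosition = transform.position;
    }

    private void Update()
    {
        if (hasFired) return;

        // Fire exactly once when the bulb leaves the trigger radius.
        if (Vector3.Distance(transform.position, restingPosition) > triggerRadius)
        {
            hasFired = true;
            onPhotoTaken.Invoke();
        }
    }
}

Measuring distance from the resting position rather than pull direction keeps the gesture forgiving: any decisive tug on the bulb reads as the yank.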

Example of UI I designed for the project: