Well, it’s 100% different than initially conceived, and it took at least three times as long as budgeted for, but the first VR game I designed is now in Early Access on Steam.
Recreational Dreaming is a surreal “sleepwalking simulator”—one part exploration game, one part shooting gallery, one part scavenger hunt—for the HTC Vive. It’s a peaceful and kind game full of otherworldly events and sub-rational moods, and while it certainly won’t be everybody’s cup of tea, I’m proud and pleased that some people have fallen in love with VR through it.
Now, if you had told me a year and a half ago how much time and effort it would take to design and produce this VR game that has little chance of recouping the money we spent on hardware and software, much less replenish my desiccated savings account, would I still have made it? Yup. No question. Do I wish I knew then all the things I learned on the way? Oh hell yes.
So in that spirit, I wanted to write a post-mortem detailing not the development process (winding, recursive, both painful and joyous), but focusing instead on some of the lessons I learned about designing for VR. My hope is to give any designer starting out in this young and exciting medium a leg up, and to encourage anyone thinking about dipping their toes in to do so. I believe that the DNA of VR design is being written right now, and I’m incredibly excited to see what forms it will grow into.
I know that some of what I focus on in here will already be known or will be easily intuited by veteran designers, but the practices I’m writing about have particular importance to VR design. I’ve also included a lot of particulars that were significant to my process or deal with UX and technical issues specific to VR.
I want to thank the many people who helped me learn these things, most importantly Ryan Donaldson, my chief collaborator through most of development, and Becky Win, who offered dozens of priceless critiques over those many months.
So without further ado, the three biggest lessons from designing and developing a weird little VR game:
We got lucky here in a couple ways. First, our social circles include people of a lot of different backgrounds, from many different cultures, with many different ideas of what makes a fine time. Second, we were able to link up with community organizers like Joshua Young (of the PDX VR meetup Design Reality), Tim Reha (of the Khronos Group meetup in Seattle), and Steven Parton (of PDX Creative Networking) who let us set up and demo Recreational Dreaming at their events (thanks guys!). We also invited everyone we knew and met to our workspace to “check out VR.” Well over 100 people went through our game through the course of development, and every single person had something to teach us.
Sometimes you even get to nerd at people.
Who can you depend on for what perspective?
Other devs are great at finding things to break. If a collider is out of place, or a trigger can be made to fail, or a certain view causes the framerate to drop, these people will find it. I’ve also never met a dev who was at all shy about saying, “Wouldn’t it be cool if...?” and so often yes, it would be absolutely rad.
VR virgins are a bonus to morale (you suddenly become the coolest person they know because you make magic), but they are also a reminder of how you can’t depend on your audience being fluent in, say, The Lab.
For instance, we had designed our initial area to require an understanding of both shooting and teleporting—our central player actions—to proceed to more interesting areas. We noticed that many first-time metaversers had little trouble trying out the buttons, but because they weren't used to teleportation as a method of travel, they would find themselves teleporting straight into a wall and staring confusedly into the gaping void. As entertaining as players' reactions to this usually were, it was a problem: they weren't making the connection between the line shooting out of their controller and the location they ended up in. To solve this, we stuck some colliders around the edge of the room that would cause the teleport to fail unless they aimed closer to themselves. Giving them a little more time to comprehend the Bézier curve of the teleport before they actually ended up somewhere else greatly improved their learning speed.
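The collider trick above amounts to a simple rule: reject teleport targets that land too close to a wall. A minimal, engine-agnostic sketch of that rule (the function name, room bounds, and margin value are all hypothetical, not from the game's actual code):

```python
def teleport_allowed(target, room_min, room_max, margin=1.5):
    """Reject teleport targets within `margin` meters of a wall, so a
    new player's teleport arc always resolves visibly inside the room
    instead of dropping them into (or through) the geometry."""
    x, z = target  # horizontal position of the aimed teleport point
    return (room_min[0] + margin <= x <= room_max[0] - margin
            and room_min[1] + margin <= z <= room_max[1] - margin)
```

In the game this was done with physical colliders that fail the teleport rather than a bounds check, but the effect is the same: players get an extra beat to watch the arc before they actually move.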
Hardcore gamers—not at all our target audience—challenged us to work on communicating the intent of our game more effectively. The playtester in Seattle who told us our peaceful, surreal exploration game “should be more like Call of Duty” is still basically a monster to us, but on our drive back down to Portland after we stopped raging about his insane philistinism we started work in earnest on our 20-second elevator pitch.
Poets, artists, psychologists, musicians, lawyers, etc., etc., each had unique personal reactions informed by their professional expertise. A friend who is a play therapist was responsible for crystallizing the game’s principle of “excitement without danger.” Poets were especially good at charting their engagement through the experience, often suggesting points of resonance we hadn’t noticed ourselves, and were very specific with their feedback.
I want to stress the importance here of a true diversity of playtesters. VR is MUCH more idiosyncratic an experience than screen games, and if you're playtesting mostly among people who share certain characteristics (devs, dudes, a certain age range, members of a subculture or interest group, etc.), you are blinding yourself to the real strengths and challenges of the medium, and missing out on valuable perspective. Not everyone can look at an RPG combat system and offer meaningful critique about the DPS values, but everyone can look around them and say, "I don't like it here because of [reasons]."
“...so what exactly don’t you like?”
Through this process we were able to understand not just many aesthetic perspectives, but physical perspectives as well. If people wanted out of the experience, usually that meant we had to tighten our belts a little to boost the framerate back up, or tweak a visual effect, or replace a sound. If someone was a different height than us, well, maybe we had to lower the edge of a wall so they could see over it, or raise the ceiling so they wouldn't clip through it.
One of the most impactful playtests never even started. My friend who has only one hand was so excited to try my game, especially after I had introduced her to Tilt Brush and portions of The Lab. I had finished strapping the headset on her when I realized that because she could use only one controller at a time she would be unable to fire the slingshot, which meant she couldn't get out of the first room. This led to some soul-searching about what our game was really about, and caused me to think critically about this very central game mechanic.
It was through exploring this question of accessibility, one I had failed (and would have continued failing) to design for, that I found an answer to a problem we were also having balancing the challenge of shooting. Depth perception over large distances isn't great in VR, and we found that if we offered the player no help they got frustrated trying to hit something too far away, but if we gave them an accurate aimline they found it too easy. Consolidating all the game-necessary controls onto one controller allowed us to address both problems at once, increasing accessibility and giving us the impetus to design a new shooting mechanic.
The slingshot became an old metal flashlight with a gem for a bulb, which is not only way cooler than any of the slingshots we had thought up, but also:
- Fits better with the more casual pace of the game that had emerged in development
- Balances the difficulty of shooting by introducing a subtle flashlight beam that doubles as an aiming guide and a pleasurable constant source of interaction
- Uses a shooting mechanic that lets players gauge shot force by audio, visual, and temporal cues, rather than the distance between their controllers
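One plausible reading of that last point is a hold-to-charge shot, where force ramps with time and the same charge value drives the audio and visual feedback. A hedged sketch of the idea (the function name, charge time, and force numbers are illustrative assumptions, not the game's actual tuning):

```python
def shot_force(hold_seconds, charge_time=1.5, max_force=30.0):
    # Charge ramps from 0 to 1 over `charge_time` seconds, then clamps.
    # The same 0-1 value can drive the beam's brightness and a rising
    # tone, so players judge force by sight, sound, and timing rather
    # than by the distance between two controllers.
    charge = min(hold_seconds / charge_time, 1.0)
    return charge * max_force
```

The design win is that every cue lives on one controller, which is what made the one-handed play described above possible.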
The other controller is now a camcorder. Its functions are just sugar.
Which brings us to:
Don’t forget: VR as a medium is still in its infancy. Maybe it’s a toddler. But a lot of what is known to “work” in VR has not yet percolated into any book, and a lot of what can “work” has yet to be discovered, much less refined. In fact, I’d argue that it’s unlikely that anyone is a true master of VR design yet, at least by Malcolm Gladwell’s “10,000 hours” criterion. Lots of what works on a flat screen does not work in the metaverse. So give yourself the freedom to revise (and re-test, and learn from) everything.
There are still 1,000 things about Recreational Dreaming that, “had we but world enough, and time,” we would change or add or redesign, but through a brutal revision process we ended up with a game way more interesting than the one we were going to make.
Pine trees uprooting themselves to flock towards the Aurora Borealis was
not a first-draft idea.
A selection of what we revised:
Revision: Genre. The initial concept for the game was a skillshot game that required more spatial awareness than a professional pool-player has and more depth perception than an HMD with two screens an inch from your eyes can provide. But on paper it sounded totally achievable—“It’s like sniper-pachinko!”
Video game genres developed (and continue to develop) because they are what work well on a screen. What works well in VR is a different story, and I guarantee whole new genres will continue to spring up as the medium matures. Already there are wave-shooters (Space Pirate Trainer, anyone?), busy-rooms (Job Simulator and Rick & Morty), rhythm-ninja (Audioshield), etc., and none of these games would be very compelling on a monitor, or with a gamepad. We eventually combined elements of walking simulators, shooting galleries, and scavenger hunts into an ultra-casual experience. Like any genre experiment, some players find Recreational Dreaming satisfying and some are left wanting more. But won’t it feel cool for the people who successfully break ground on a whole new VR tradition? And it’s not like you have much choice at this point in time but to try.
“A rose is a rose is a rose is a rose.” -Gertrude Stein
Revision: Shot variety and ammo. After picking our “shooting balls at stuff” mechanic, we wanted to see how far we could push it. We made the balls super-bouncy. We made them play by the rules of gravity, or not. We made them explode into other balls, fire out in bursts, go super fast or super slow. We made them huge and tiny, ethereally glowing and materially realistic. We made them translucent and put things inside them. We gave them a trail and then tried 30 different trails. We tried a dozen different noises for their bounce, and tried a dozen different effects on their deaths. Eventually we settled on six types of shot that stayed fun to use.
In trying to tackle how players would gain access to the different types of shots, we decided to complicate the game a little more by giving the player a limited amount of ammo and forcing them to find charging stations hidden throughout the levels to get more ammo. We thought this would reward thorough exploration and add some sense of consequence to each shot. And maybe it did, but I noticed that when we were demoing, the volunteer who’d often help us get people through the game had to remind a lot of players to recharge. They didn’t want to go hunting for ammo; they wanted to play around and look at the surreal environments.
I was loath to cut the feature because we’d spent so much time on it (and the charging effect I had scripted was so cool!), but ultimately I decided that each level would feature one kind of shot. Players still get variety, but also get to stop thinking about “the game” and keep focusing on the experience.
Sometimes people shouldn’t have to care how many shots they got left.
Revision: Starting room & tutorial. Reading instruction manuals is boring. Reading instruction manuals in pixelated VR-space is excruciating. Trying to remember controller diagrams when you find yourself in a strange new world in the middle of something gorgeous, and exciting, and scary, and weird, is impossible.
I mean, of course we tried the tool-tip thing. Then we tried to fit that into our metaphor and made the tooltips drift in and out with sometimes contradictory instructions. Then I scripted a sequence where both the button you needed to press to learn how to charge, shoot, or teleport, and the thing you needed to point at would glow the same noticeable color. But after we got rid of the charging stations and before I had gotten around to updating the controller tutorial script, players started figuring it out on their own. We had designed the starting room to require shooting the first target to create an exit, and leaving requires a teleport. With so few buttons on the Vive controller to choose from, and with the couple of visual cues we’d included to point to the first target, all our playtesters discovered what to do using only their natural curiosity.
We had it pretty easy in this regard as Recreational Dreaming’s game mechanics are super simple. But the best-designed VR tutorials I’ve seen (The Lab, SUPERHOT VR, etc.) trust their players’ curiosity and willingness to just play around and see what happens, and give them a relatively safe space to experiment for themselves.
Revision: Every single visual element. Things really do look different in VR. We found that scale is extremely difficult to judge from the editor. Normal maps usually look best at less than half strength. Secondary normal maps really help impart physicality to objects. Large billboard particles rotate unnaturally with your head. Colors are brighter in the headset. Trying to hand-draw skyboxes is insane and anyone who does it is a genius. People love lots of geometry and high-quality textures on the controllers. People don’t care about lots of geometry or quality textures on most anything more than ~10 meters away. The HDR camera is your friend. The Ambient Occlusion post-processing effect is not. Anti-aliasing is huge. Etc. etc. etc.
One of the biggest problems we had was figuring out the waves in the underwater level. I went through eleven different approaches to them—Unity’s “Water 4” prefab in the Standard Assets, linking up a matrix of transparent primitives with spring joints, making a huge cloth and animating its movement, all sorts of different particle systems—and none of them could get the look we wanted without killing performance. This was a side-project for over nine months. Finally, we started asking ourselves if the detail we wanted in the waves couldn’t be suggested somewhere else. The solution we ended up with was highly abstracted waves—the simplest particle system we had tried—combined with a projector that coats the geometry in an animated series of caustic light refractions. Rather than forcing the player to look up to confirm they’re underwater, the evidence is on everything around them.
The Suggestion of Waves: New & Selected Poems
The takeaway here is that getting something to look okay in VR is more of a challenge than getting something to look good on the screen. Getting something to look magical in VR will probably take a lot of trial and error, a lot of time in the headset, and, if you’re lucky enough to have them, your artist-friends’ honest critiques.
As for level design, that’s enough of a beast to warrant its own section.
We started Recreational Dreaming with our minds still stuck in screen-space. Our first sketches of the rooms and events were all done on paper, and every single one revealed at least one major problem the first time it was modeled up. We revised this process a couple different times—elongating the storyboard, using different paints and brushes, constructing dioramas—until, in one of the biggest duh moments I’ve ever had, I sketched out a level in Tilt Brush.
VR painting programs changed everything.
The old way was sitting around for a bit trying to imagine what might look cool viewed from where, spending 10-30 minutes on sketching it out, handing the sketch to Ryan who might draw his own take on it, agreeing on a version, and then it taking him between 20 minutes and two hours to model up a draft in Blender, before then importing it into the scene in Unity, setting it up with a collider, tagging it “Teleportable”, strapping on the headset and, finally, checking it out. Then, to critique the design, whoever was in the headset would do their best to describe what might need changing and the other person would do their best to take notes. We were cavemen, basically.
With Tilt Brush (or Quill or Medium or whatever), revision happens on the fly, there are no surprises about the experience of the space, and you can painlessly start iterating on the level’s color palette while you concept. Want to do large-scale planning? Zoom way out, look at it top-down. Want to see what the player will see? Zoom on into where they’ll be. And the product of your labor is a hundred times as useful: once we settled on a design it was no problem to save it out as a mesh or as reference art to guide the modeling. The new way is way faster, way more effective, and way more fun than the old way.
Concepting a swimming pool superimposing itself over a physicalized topographical
map of a lake just doesn’t seem like the job of a pencil.
Of course, the right tools can only get you so far. Some other considerations for designing spaces in VR:
Don’t forget traditional depth cues. This is an easy lesson to overlook; I see a lot of games that have large open spaces that seem flatter than they should. One can’t totally trust the HMD to effectively represent the difference between something 100 meters away and something 150 meters away, at least until HMDs aren’t two screens an inch or so away from your eyes. It still takes some good ol’-fashioned trickery to communicate depth in VR, and while some of this can happen at the engine level (fog! Use fog to suggest the color-leaching qualities of atmosphere!), much of this can be designed into your levels.
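For the fog point in particular, the depth cue comes from blending distant surfaces toward a single atmosphere color. A sketch of the standard exponential fog factor (this is the underlying math, not Unity's implementation; names and the density value are illustrative):

```python
import math

def apply_exp_fog(surface_rgb, fog_rgb, distance, density=0.02):
    # The fog factor is 1.0 at the camera and decays toward 0 with
    # distance, so far-off surfaces leach toward the fog color. That
    # color shift is the aerial-perspective cue that tells the eye
    # "that thing is far away," independent of stereo separation.
    f = math.exp(-density * distance)
    return tuple(f * s + (1.0 - f) * g for s, g in zip(surface_rgb, fog_rgb))
```

In Unity this lives in the scene's Lighting settings (RenderSettings.fog); the sketch just shows why a well-chosen fog color and density reads as depth rather than haze.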
Breaking up large spaces with occluding elements at different distances reinforces the sense of parallax. Putting full 3D elements (clouds, rainbows, airplanes, etc.) in front of your skybox does too. Avoid using differently scaled versions of the same model in places where both instances will be viewed at once. And two or more parallel planes across a space help the brain decipher the distance.
Contrast is king. Whether it’s moving from a dark space to a bright space, from small to large, from blue to red, from quiet to loud, from realistic to magical, from safe to dangerous, from geometric to natural, etc., etc., or vice versa, the times a player is most excited are when they are experiencing something different. Your spaces exist in the context of each other, and the most exquisitely detailed rooms will start to seem boring if they sit in a sequence of similar rooms. There are a lot of painless little hacks that can make a huge difference to your players’ experience, so make sure you’re taking a look at each environment or instance of your play space and asking, “What can be a noticeable change here?”
People feel most comfortable and most present around “man-made” objects. Maybe it’s that so many man-made objects are designed with similar tools to 3D modeling software used in games, or maybe it’s that they exist in our minds at a level of abstraction that we don’t apply to the natural world, or maybe it’s something more spiritual than that. I don’t know. But the further we abstracted the controllers, the less players liked them, and the more we populated our levels with treehouses, bridges, and rowboats, the more immersion our playtesters reported.
Perhaps unsurprisingly, few players have reported that
an ever-shifting tunnel of paint splotches feels “real”.
It’s really easy for people to get turned around in VR. Our first two levels are almost embarrassingly linear. This wasn’t our original intention. We found that the more we complicated paths, the more players ended up backtracking to where they started. If you are asking the player to traverse any distance, especially if you’re using teleport locomotion, please: make your paths simple. Give your players a consistent visual reference point (a mountain, a totem, a skyscraper) that they can see from anywhere in your level. Zone your level to have points of contrast within it (different colors, shapes, lighting, sounds, whatever).
This, I think, is one of the hardest aspects to have perspective on as a designer or developer. For you, the whole level exists as it does in your game engine; you’ve got it internally mapped from a god’s-eye view and can’t get lost. Your players only have their limited human perspective, and us humans need all the help we can get.
2 out of 10 playtesters successfully navigated this canyon path
in an early version of level 3, and both were other devs.
When I first considered designing for VR I was excited by possibilities that turned out to be impossible or ineffective with the current technology, and I imagine most VR designers have had similar enthusiasms. Don't be discouraged if your first dozen ideas turn out dumb! There are thousands of unique design problems for VR, and for every step the technology takes there will be hundreds more.
What I've come to understand is that the chief joy in designing for this nascent medium is not the instant translation of one's imagination into "reality", but the intimate view it offers into the experience of being human. Appreciating other people's perspectives, empathizing with their experience, and being able to give them such a deep slice of your own perspective and experience is something this medium has the potential to do better than any other.
It might take a while to get there. It might even take a long minute for VR experiences to be reliably profitable. But as far as educational experiences go, I couldn't recommend anything more than carving out a whole experience for other people.
And oh yeah! Give Recreational Dreaming a shot!
Donald Dunbar (@333uuu) is a VR designer and developer and a founder of the Portland-based VR studio Eyedrop. He is also the author of two books of poetry. Sign up for his very occasional newsletter here or check out his portfolio at donalddunbar.com.