Best practices for VR, from seven devs working with the Oculus Rift
By Kris Ligman
Gamasutra's Kris Ligman speaks with seven leading developers who are making games for the Oculus Rift VR goggles, as part of our Advanced Input/Output Week.
With the Oculus Rift serving as the forerunner in a new wave of interest in virtual reality technologies for games, Gamasutra sought out several developers who had managed to get their hands on a development kit -- and asked how their projects are coming along.
Julian Kantor (Elevator Music), Lau Korsgaard (Virtual Internet Hacker), Robert Yang (Nostrum) and E McNeill (Ciess) were among a select few chosen to exhibit their works from Oculus's VR Jam at IndieCade this October. Sigurdur Gunnarsson is currently at work building EVE: Valkyrie, a game set in the universe of CCP's EVE Online and specifically designed for VR. Finally, Andre Maguire is creative director of Zombie Studios, currently working on the Rift-compatible Daylight.
Below, we asked these developers how they settled on their concepts, what they learned, and how they dealt with some of VR's biggest known issues.
When you set out on the project (be it for a jam, or a prototype), did you have a clear concept in mind?
Julian Kantor (Elevator Music): As the start of the jam was approaching, I didn't have too clear of a concept. Since I first heard about the Oculus Rift, watching videos of John Carmack from E3 2012 talking about how near and how awesome consumer VR would be, my mind had been racing with different ideas I could make into a VR game.
I got the opportunity to use a lot of those ideas in a game called The Recital, which I put together for E3 this past year at the Indiecade booth. I had four weeks to make that game (one more than I had during the VR jam!) and it was an awesome experience to get to make something and show it at E3, but I was sort of tapped out when it came to VR game ideas in the lead-up to the jam.
The Recital was a game about waking up from a dream, something I thought worked really well within the context of using the Oculus Rift. When you take off the Rift after being completely immersed inside of a virtual world, you almost feel as though you are waking up from some crazy dream.
[So] Elevator Music started with me riffing off of that idea, and turning into a more literal "virtual reality within a virtual reality" conceit. From there, I started developing Omnihedral Incorporated, the company whose mysterious and surreal corporate headquarters serves as the setting for the game.
Robert Yang (Nostrum): I started by working on a lion simulator where you eat misogynists, but then I watched Porco Rosso and realized that movie is actually very much about looking -- and many current flight simulators in development right now have complex controls, complex flight models, and closed cockpits. So one week in, I stashed my lion simulator away, and changed to that idea.
Robert Yang's Nostrum
Lau Korsgaard (Virtual Internet Hacker): Our goal from the beginning was to make something that looks as badass as we all imagined virtual reality in the 90s. We were less interested in mechanics and immersion and more interested in the social context around the play situation. How does the player look and feel? We really wanted to procedurally generate the levels based on the actual HTML code of the website you are hacking, but time constraints simply made that impossible. What we ended up with is much better: we grab a screenshot from the website you are trying to hack and project that onto the material of a predesigned level. This makes it much more obvious that players are hacking a real site, instead of showing obscure HTML no one can recognize.
E McNeill (Ciess): When I pick a project, I usually try to find a sweet spot between "what I want to do" and "what I'm capable of doing". It was immediately clear that a cyberspace hacking game was perfect, since it would allow a lot of design flexibility and it would let me de-emphasize the 3D art, which isn't my strength.
I didn't do any significant prototyping of other game concepts. That said, I had lots of ideas of what I wanted to experience in VR. Luckily, so did other developers, and in many cases they're already making the games that I want to play. For example, I wanted to make a game about skydiving in a wingsuit, and there's already Volo Airsport, The Wingsuit Madness, and AaaaaAAaaaAAAaaAAAAaAAAAA!!! for the Awesome available. The dev community is doing a lot of public prototyping, and we're all learning from each other about what works and what doesn't.
Sigurdur Gunnarsson (EVE: Valkyrie): At first we just wanted to make a game for VR since we were very excited about the Oculus Rift. Once we had gathered a small group of people, we started discussing ideas and this one quickly became the most favored one - especially since it was a subject close to our hearts (spaceships) and used the EVE universe. Having it based in the EVE universe also allowed us to reuse assets from EVE Online, giving us shortcuts in development time.
What was your goal going in to the project, creatively speaking?
Kantor: The most appealing thing about VR to me is the opportunity to be fully immersed in a fully-realized story world, so I wanted to create a story-based game with a unique, surreal environment that would be both wondrous and sort of terrifying to explore.
Yang: To make something that sufficiently spoke to gamers' sensibilities while not being totally boring and instantly knowable.
McNeill: I like to use game jams as a chance to create prototypes for ideas that I've had for a while but had been procrastinating on. A tight deadline and a light dose of competition is the perfect motivator for me.
I had already been playing around with the idea of a VR cyberspace hacking game for a long time, so my goal in the VR jam was to finally realize it and evaluate it. The jam was the perfect opportunity for me to actually test the idea out in the real world, and to get some VR dev experience to boot.
E McNeill's Ciess
Oculus has been pretty transparent about many of the big obstacles to VR design -- in particular, simulation sickness, latency, and UI. Did you run up against these problems during the project? How did you choose to address them?
Kantor: For me, the biggest challenge has been in designing a way for players to move around freely in the virtual environment without getting simulation sickness.
I've seen a lot of games that circumvent this issue entirely by making the entire game control through head-tracking only, which feels great and is a natural fit for the platform, but it seems like it would be a prohibitively limiting constraint if every successful VR game had to follow it.
The big problem with standard FPS controls on the Rift is that players who have this control scheme hard-wired into their brains have a very hard time turning off their reflexes, and try to control a VR game as if it were any other first-person game. This can cause some serious nausea, since players will be looking wildly around with the right stick as they are moving, all while their inner ear is convinced that they are looking straight forward and not moving.
I've found that the amount of simulation sickness decreases remarkably when players use their heads and bodies to turn and inspect the space around them, rather than using the controller to do so. I've been experimenting with control schemes to try to facilitate this sort of play, but I'm not sure I've landed on anything totally satisfactory yet. I think the closest I've come was with The Recital.
I mapped turning to the left and right bumpers, and forward movement to the right trigger -- a spin on classic, almost universally-maligned tank controls. My thinking was that by purposely breaking our reflex to use the controller to navigate the space, we will instead rely on our instincts to look around for ourselves, only using the controller for turns that would be physically uncomfortable to make with our heads. When I showed The Recital at E3, I got very few complaints about physical discomfort or simulation sickness, which I attribute both to the control scheme and the environmental design.
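Kantor's bumper-based turning is close in spirit to what VR developers now call "snap" or comfort turning: the controller rotates the body in discrete steps, leaving fine-grained looking to the head. A minimal sketch of that idea, with an assumed snap angle (the function name and constant are illustrative, not from Kantor's game):

```python
SNAP_ANGLE = 30.0  # degrees per bumper press (assumed value)

def apply_snap_turn(body_yaw, left_pressed, right_pressed):
    """Return the new body yaw after a bumper press, wrapped to [0, 360).

    Discrete steps avoid the smooth stick rotation that conflicts
    with the inner ear; in-between looking is done with the head.
    """
    if left_pressed:
        body_yaw -= SNAP_ANGLE
    if right_pressed:
        body_yaw += SNAP_ANGLE
    return body_yaw % 360.0
```

Because the rotation happens in one instantaneous jump rather than a continuous sweep, the vestibular system is never presented with sustained visual rotation it cannot feel.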
Maguire (Daylight): I can give you a perfect example [that we encountered in Daylight]. In a mouse-and-keyboard or controller-based [FPS] you have arms to sort of root the player in their orientation, but when you’re moving around in the VR thing and your arms are locked to your view, it feels like you’re tumbling through the environment. It was completely disorienting. So even detaching those so that you have free head movement, kind of like a turret, cut way down on motion sickness.
[Regarding latency] we're going through it iteratively, and figuring out ways to improve that. The first thing we did was detach head movement from the arms. The next thing is likely going to be the resolution. Another is just making sure the player has good visibility. Tuning up the field of view, making sure the lighting is set up so you don’t feel you’re not in control, is going to be key for us.
Yang: I get VR sickness from, like, every single game on the Rift. Valve talked a lot about how "speed" is a major contributor to VR sickness... but is it about the speed, or is it about the perceived notion of speed? What if I minimized speed indicators like surface detail and foreground? To me, this experiment in art style went well with an animated treatment of Mediterranean islands. I don't really know if this technique works or anything, but at least it doesn't make it worse.
Korsgaard: So the UI we have is actually meant to confuse rather than give clarity. We print left-aligned text directly and flat on the screen. This makes it really blurry and semi-transparent for the player -- they only see it with their left eye -- but spectators can read it perfectly well. The social context is really important to us when we design games, even virtual reality games, so a lot of our UI is meant to be read by spectators. While you play, you frantically tap on the keyboard, and the joke is that spectators will see that what you are writing is actually meaningful code. This is one of those decisions we have taken to make you feel cooler than you actually are.
For simulation sickness I very much agree with [Oculus founder Palmer Luckey and vice president of product Nate Mitchell]. Do not map a known control scheme onto your VR game; make a control scheme that suits the affordances and constraints of the technology. I get really sick in games where I have the option to strafe or rotate the camera with anything but my head. The controls are pretty simple in Virtual Internet Hacker: charge up movement by hammering on the keyboard, look in a direction, hit space and you will shoot yourself in that direction. This makes looking around the key interaction while still limiting simulation sickness, because you don't change direction that often.
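Korsgaard's charge-and-launch scheme can be sketched in a few lines. This is a hypothetical reconstruction from his description, not Virtual Internet Hacker's actual code; the charge rate and max speed are assumed values:

```python
CHARGE_PER_KEYPRESS = 0.1  # assumed charge gained per key hammered

def hammer(charge):
    """Each keypress adds charge, capped at full (1.0)."""
    return min(charge + CHARGE_PER_KEYPRESS, 1.0)

def launch(charge, gaze_dir, max_speed=8.0):
    """On space, convert stored charge into a velocity along the
    (unit-length) gaze direction, then reset the charge to zero."""
    return [d * charge * max_speed for d in gaze_dir]
```

The design point is that direction comes only from head gaze and movement happens in infrequent, deliberate bursts, so the player rarely experiences continuous artificial motion.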
McNeill: These problems certainly exist, and I don't know the general best practices for solving them. I chose instead to circumvent them, creating an abstract game that was designed from the start to avoid these issues. For example, I limited the player to certain types of motion in order to avoid simulator sickness. I dealt with latency by keeping the graphics simple (to ensure a high framerate) and by reducing the need for the player to look around too quickly. UI design is always tricky for me, but I knew from the start that I wanted to keep the UI floating in the game world rather than having it stuck to the player like a usual game HUD, and so I designed the game systems to fit that constraint.
If you're trying to adapt an existing game to VR, you might not have all of those options. I think I made my work a lot easier by refusing to design a game based on a traditional genre. Instead, I started out by considering what would work well in VR, and I worked backwards from there.
Lau Korsgaard's Virtual Internet Hacker
Gunnarsson: Doing UI in virtual reality is definitely tricky. The old norms of placing UI elements at the edges of the screen don't work in VR, and having them follow your view is very disorienting.
We've found the best solution is to make the UI a part of the virtual world, either as part of your environment (cockpit dashboard, monitors and panels) or closely following elements in your scene, like a targeting reticle following an enemy spacecraft.
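A world-space reticle like the one Gunnarsson describes can be positioned by placing it on the line from the camera to the tracked target, at a fixed comfortable depth. A minimal sketch (function names and the depth value are illustrative assumptions, not Valkyrie's implementation):

```python
import math

def reticle_position(camera_pos, target_pos, reticle_distance=2.0):
    """Place a world-space reticle on the camera-to-target line,
    at a fixed depth so it renders comfortably in stereo.

    Positions are (x, y, z) tuples; returns a list [x, y, z].
    """
    # Direction from camera to target.
    dx = [t - c for t, c in zip(target_pos, camera_pos)]
    length = math.sqrt(sum(d * d for d in dx))
    if length == 0:
        return list(camera_pos)  # degenerate: target at camera
    # Normalize, then step a fixed distance from the camera.
    return [c + d / length * reticle_distance
            for c, d in zip(camera_pos, dx)]
```

Keeping the reticle at a fixed depth in the world, rather than painted on the screen, means both eyes converge on it naturally instead of fighting the stereo rendering.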
Simulation sickness is a very hard problem to solve, and will probably be one of the biggest issues to fix before we see broad adoption of VR hardware in the next few years. The cause of it is the brain receiving visual signals of movement while the body doesn't feel any corresponding force.
In Valkyrie we've mitigated this problem in a few ways. Having a static cockpit around you, as well as a visible first-person avatar, helps ground you in the scene. It also helps to maintain constant forward momentum and to avoid big, sudden changes in velocity. As an example: if you go from 0 to 60 km/h in a fraction of a second, you will feel like you were kicked in the back.
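Avoiding sudden velocity changes like the 0-to-60 example amounts to clamping acceleration each frame. A hedged sketch of that smoothing step (the function and its limits are assumptions for illustration, not Valkyrie's code):

```python
def step_velocity(current, target, max_accel, dt):
    """Move a (scalar) velocity toward a target, but never change it
    faster than max_accel units per second.

    Called once per frame with dt = frame time in seconds.
    """
    delta = target - current
    max_step = max_accel * dt
    if abs(delta) > max_step:
        delta = max_step if delta > 0 else -max_step
    return current + delta
```

With a conservative `max_accel`, even a large throttle input is spread across many frames, so the visual acceleration stays within what players tolerate without a matching physical force.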
Latency is another thing to keep a close eye on. When you turn your head, the number of pixels being shifted in one frame can get quite large, and this grows even larger when the framerate drops below 60 fps. This causes a noticeable lag between what you are seeing and what the brain believes you should be seeing, which causes simulation sickness. I agree with Carmack that we need to sacrifice some visual fidelity for a higher framerate, since it's a choice between being able to play the game comfortably and some eye candy.
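The per-frame pixel shift Gunnarsson mentions can be estimated with back-of-the-envelope arithmetic, assuming a roughly linear pixels-per-degree mapping across the display (a simplification; real HMD lens distortion makes this non-uniform):

```python
def pixels_shifted_per_frame(head_speed_dps, fps, h_res, h_fov_deg):
    """Approximate horizontal pixels the image shifts in one frame
    while the head turns at head_speed_dps degrees per second.

    h_res / h_fov_deg gives an average pixels-per-degree density.
    """
    degrees_per_frame = head_speed_dps / fps
    pixels_per_degree = h_res / h_fov_deg
    return degrees_per_frame * pixels_per_degree
```

For example, with 900 horizontal pixels spread over a 90-degree field of view, a 60 deg/s head turn shifts the image about 10 pixels per frame at 60 fps, but 20 pixels per frame at 30 fps -- which is why a dropped framerate makes head-tracking lag so much more visible.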
Knowing the constraints this project involved, how did you determine what concept you'd work on?
Kantor: I kicked around a lot of different ideas in the run-up to the competition, but the concept for Elevator Music was the one that I kept coming back to. One of the inspirations for the game's concept was a dream I had in which I was riding an elevator in an impossibly huge office building. The idea of incomprehensible scale is something that really fascinates me, and I think it is a perfect theme to explore in virtual reality.
Yang: When I jam, I usually gravitate toward the smallest self-contained idea. The lion simulator was only going to be interesting if stuff happened, or if I implemented an ecosystem thing where you could chase prey or be hunted, etc. The flight simulator has much less complex expectations: you need a plane and a world to fly around in and that's pretty much it. Working in a genre is useful because it guides players' expectations one way.
McNeill: I usually choose a project by finding a good combination of what I want to do and what I'm actually able to do. A cyberspace game was perfect, since (I figured) it would be awesome in VR, and it would be simple enough to prototype in 3 weeks.
Gunnarsson: It shapes everything from content creation to visual effects to programming. For example, we need to keep a very high framerate, and the scale of objects in the game needs to "feel right."
If you were to develop your current prototype game beyond what we've seen of it so far, what would you like to improve?
Kantor: During the run-up to Indiecade, I changed quite a few details of Elevator Music -- creating more music and sound design, making usability tweaks and fixing lots of glitches.
I have tons of things I would want to work on if I got the opportunity to keep working on it, but I think I would start with lots of iteration and playtesting on the puzzle and narrative design. Because I only had three weeks to make the game, I mostly focused on its audio-visual environment during the jam, which is the aspect of making games that most interests me. However, the navigational aspect of Elevator Music is really its core mechanic from the player's perspective, and it can only be figured out through analyzing and cross-referencing information you find on the terminals hidden throughout the game.
Right now, tons of information is doled out all at once, and the game expects the player to be able to synthesize it all right off the bat. I want to try to spread out the information drip and knowledge gates as much as possible, and work on the narrative design to make sure the player has a clearer understanding of what they're trying to do, and how to do it, at all times.
Julian Kantor's Elevator Music
Yang: Right now, [my game's] world is procedurally generated, but I also want to procedurally generate different setpieces and plot chains. I want a sort of "narrative-based flight sim stealth roguelike."
Korsgaard: Global highscores! Imagine if you could "own" www.cia.gov until someone else hacked it better than you! -- oh and I would also totally love to make a big campaign around the game with full motion video cutscenes and other 90s stuff -- distributed on CD-ROMs.
McNeill: At this point, I'm definitely planning to develop a full version of Ciess, which I'd like to launch alongside the Oculus Rift.
I think that the mechanics of Ciess could use a serious reworking. The gameplay in the prototype feels good -- like you're exploiting vulnerabilities in a vast system -- but it doesn't have much depth. I'd like to have gameplay that's truly rich and strategic, and it's just not there yet.
I'd also like to make Ciess a showcase for all that's cool about VR. The prototype's biggest strength is that it tries to make use of all the Rift's features (head-look, 3D effects, immersion, fully-surrounding environments). I'm hoping to add a lot more moments of "woah".
Maguire: This is all a bit of a process of discovery right now. We're trying to figure out what's going to work and what our conventions will be. Figuring out how we're going to do the movement, how we're going to do the interface -- these are the problems that we're faced with, and that we want to solve. We have some ideas of how we want to do that, but there will certainly be some discoveries along the way.
What is one thing you believe devs need to keep in mind when working in VR?
Kantor: I don't think I can speak for other developers, as there are tons of completely valid approaches that different niches of players will enjoy more than others. But for me, both as a creator and a player, I am more interested in exploratory VR games that you can take at your own pace. There are all kinds of neat mechanics that can be derived from the head tracking built into the Rift, but what fundamentally excites me about VR is the opportunity to physically inhabit an otherwise inaccessible space.
Yang: Photorealism is pretty boring, and VR doesn't change that.
Korsgaard: Immersion doesn't have to be the goal! It is not even the most interesting part of VR. Lots of people see VR as an interface that finally lets players be completely immersed in the game, and blindly use that as the only goal of their development. I think that is a false promise; there are still lots of artificial layers between the player and the system -- they sit on a chair, they hold a controller in their hand, they get sick, they are scared of being blinded to the real world. Accept those premises of the technology; actually embrace those premises.
McNeill: A lot of the established conventions of modern games just don't fit in VR. Be warned. There's no screen corner where you can put your UI. Traditional camera controls feel uncomfortable. There's no way to ensure that the player is looking at your cutscene. Oculus has made it very easy to plug a Rift camera into your game, but that can be a curse in disguise; it's not truly as easy as it looks!
Gunnarsson: Pay attention to the scale of objects in your 3d world. You have been trained to perceive scale based on a number of things, including stereoscopic view, comparison with other objects of known sizes, parallax movement (needs positional tracking) and more. If you get the scale wrong you can feel like a dwarf or a giant, which in most cases is not the effect you are going after.
CCP's EVE: Valkyrie
Copyright © UBM Tech, All rights reserved