The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.
The Oculus Rift virtual reality headset is expected to arrive in March for Kickstarter backers (and in April for pre-orders). I personally had the chance to try the headset at IndieCade East and found the technology both immersive and promising.
For those of you who do not know, the Oculus Rift is a virtual reality system with head-tracking that allows the player to look around a 3D space with input solely from their neck. This head-tracking has three degrees of freedom (henceforth, axes), so it supports anything from panning your head left or right (even over your shoulder), to arching your neck up or down (behind you if you lean back enough), to tilting your head from side to side.
That input can be used to change the direction of not only video but also audio within the game world (although the system does not supply its own headphones). If a user hears a sound and instinctively turns their head, they will get a better sense of where the source sits within the game world. Furthermore, the release version should update at 1,000 Hz, which will hopefully make screen and audio refresh latency unnoticeable (although developers will still want to keep their games running at a reasonably high frame rate).
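As a rough illustration of how head yaw could feed the audio mix, the sketch below computes a sound source's angle relative to the player's head; as the player turns toward the source, the relative angle shrinks toward zero and the sound centers. This is my own toy function, not the Rift SDK or any real audio API.

```python
import math

def sound_pan(source_angle, head_yaw):
    """Angle of a sound source relative to the player's head.

    Both angles are in radians, with 0 meaning straight ahead in the
    game world. The result is wrapped to (-pi, pi], so a source behind
    the player never reads as more than half a turn away.
    """
    rel = source_angle - head_yaw
    # atan2 of (sin, cos) wraps the difference back into (-pi, pi]
    return math.atan2(math.sin(rel), math.cos(rel))
```

A source 90 degrees to the left centers (returns ~0) once the player has turned 90 degrees toward it, which is exactly the instinctive head-turn described above.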
What I found even more interesting than the immersion of this new piece of hardware, however, were the implications of its head-tracking input system.
Instead of relying entirely on old control schemes, the Oculus Rift enables user interface designers to create control systems that are not based on the standard WASD-and-mouse or dual-analog layouts.
Space simulators (alternatively, flight simulators):
The most interesting case, in my opinion, is the space simulator. The first set-up I imagined was one with four rules for the Oculus Rift input system:
- The player moves forward when looking forward (this must be calibrated).
- The player gradually turns left when looking left, and vice versa.
- The player pitches up when looking up, and vice versa.
- The player barrel rolls when tilting their head.
By using head-tracking to maneuver the ship, you free up the dual analog sticks (on a controller) or WASD and the mouse (on a keyboard). On a controller, this does not only grant easy barrel rolls; it also allows each analog stick to control a gun.
Remember grabbing both of those toy guns in two-player arcade shooters? Imagine that, but while controlling a space ship.
Mono-taskers beware: this means simultaneous control of seven axes (three from the head and two from each thumb). Add additional buttons as your heart desires.
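The four rules above can be sketched as a single per-frame steering function. Everything here is hypothetical: the function name, the tuning rates, and the assumption that some headset API hands us calibrated yaw, pitch, and roll in radians.

```python
import math

def steer_from_head(yaw, pitch, roll, dt, turn_rate=1.0, pitch_rate=1.0):
    """Map head orientation to ship maneuvers per the four rules.

    yaw/pitch/roll are radians relative to the calibrated forward pose;
    dt is the frame time in seconds. turn_rate and pitch_rate are
    invented tuning constants.
    """
    thrust_on = math.cos(yaw) > 0.9      # looking roughly forward -> move forward
    turn = -yaw * turn_rate * dt         # looking left -> gradually turn left
    pitch_up = pitch * pitch_rate * dt   # looking up -> pitch up
    roll_amount = roll                   # tilting the head -> roll the ship with it
    return thrust_on, turn, pitch_up, roll_amount
```

Note that roll maps directly while yaw and pitch apply gradually per frame, matching "gradually turns" in the rules; both thumbs stay free for the guns.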
First-person shooters:
The first-person shooter gains one free axis from the head-tracking capabilities of the Oculus Rift, since the vertical component of the mouse or right analog stick (for PC and controllers, respectively) is now handled entirely by the headset. That freed axis can be repurposed to control depth: a continuous zoom for sniper rifles, or a range handler for weapons like mortars and grenades. Previous games had trouble with these functions. Zoom has historically required either a toggle key (most shooters), a pair of keys acting as a makeshift axis, or the scroll wheel. Mortars have historically required an implicit "hold down this long to shoot this far" technique (as in Gears of War) or a "siege mode" that restricts player movement.
By adding the element of depth to the mix, first-person shooters can offer more engaging gameplay, which could lead to the creation of some truly novel weapons.
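To make the freed axis concrete, here is a sketch of it driving a continuous sniper zoom and an explicit mortar range, replacing the toggle keys and "hold to charge" techniques described above. Function names, ranges, and tuning values are all invented for illustration.

```python
def update_zoom(zoom, stick_y, dt, zoom_speed=2.0, min_zoom=1.0, max_zoom=8.0):
    """Freed vertical stick axis drives a continuous sniper zoom.

    stick_y is the stick deflection in [-1, 1]; dt is frame time in
    seconds. Returns the new zoom factor, clamped to the allowed range.
    """
    zoom += stick_y * zoom_speed * dt
    return max(min_zoom, min(zoom, max_zoom))

def mortar_range(depth_axis, min_range=5.0, max_range=50.0):
    """Same axis as an explicit mortar range, instead of hold-to-charge.

    depth_axis in [-1, 1] maps linearly onto [min_range, max_range],
    so the player dials in distance directly while staying mobile.
    """
    t = (depth_axis + 1.0) / 2.0  # [-1, 1] -> [0, 1]
    return min_range + t * (max_range - min_range)
```

Unlike a "siege mode", nothing here touches the movement controls, so the player keeps full mobility while ranging a shot.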
As a generalization:
By adding these three new axes, the Oculus Rift has given game designers a few new toys to play with.
What can you imagine these three axes being used for?