The Oculus Rift and its New Input Axes
by Matthew Downey on 02/28/13 12:05:00 am   Featured Blogs

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

 

The Oculus Rift virtual reality headset is estimated to arrive in March for Kickstarter backers (and in April for pre-orders).  I personally had the chance to try the headset at IndieCade East and found the technology both immersive and promising.

For those of you who do not know, the Oculus Rift is a virtual reality system with head-tracking that allows the player to look around a 3D space using input solely from their neck. This head-tracking has three degrees of freedom (henceforth axes): yaw, panning your head left or right (even over your shoulder); pitch, arching your neck up or down (even behind you if you lean back far enough); and roll, tilting your head left or right.

The Oculus Rift Virtual Reality Headset

That input can be used to change the direction of not only video but also audio within the game world (although the system does not supply its own headphones).  If a user hears a sound and instinctively turns their head, they will get a better sense of where the audio source sits within the game world.  Furthermore, the release version should update at 1000 Hz, which will hopefully result in unnoticeable screen/audio refresh latency (although developers will still want to keep their games running at a reasonably high frame rate).
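To make the coupling concrete, here is a minimal C++ sketch, assuming hypothetical engine hooks (readHeadTracker, setCameraOrientation, and setAudioListenerOrientation are stand-ins, not the actual Oculus SDK API): each frame, the same yaw/pitch/roll reading drives both the camera and the audio listener, which is what makes an off-screen sound resolve correctly when the player turns toward it.

```cpp
// Minimal sketch: lock both video and audio to the head-tracker each frame.
// All functions here are hypothetical stand-ins for whatever an engine provides.
#include <cstdio>

struct HeadPose {
    float yaw;    // radians: panning left/right
    float pitch;  // radians: looking up/down
    float roll;   // radians: tilting left/right
};

// Stand-in for polling the headset's orientation sensors.
HeadPose readHeadTracker() { return HeadPose{0.3f, -0.1f, 0.0f}; }

// Stand-ins for the engine's camera and audio-listener hooks.
void setCameraOrientation(const HeadPose& p) {
    std::printf("camera   yaw=%.2f pitch=%.2f roll=%.2f\n", p.yaw, p.pitch, p.roll);
}
void setAudioListenerOrientation(const HeadPose& p) {
    std::printf("listener yaw=%.2f pitch=%.2f roll=%.2f\n", p.yaw, p.pitch, p.roll);
}

int main() {
    // One frame of the update loop: video and audio stay locked to the head.
    HeadPose head = readHeadTracker();
    setCameraOrientation(head);
    setAudioListenerOrientation(head);
    return 0;
}
```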

What I found more interesting than the immersion offered by this new piece of hardware, however, were the implications of the new head-tracking input system.

Instead of relying entirely on old control schemes, the Oculus Rift enables user interface designers to create control systems that are not based on the standard WASD-and-mouse or dual-analog layouts.



Space simulators (alternatively, flight simulators):

The most interesting case, in my opinion, is the space simulator.  The first set-up I imagined was one with four rules for the Oculus Rift input system:

  1. The player moves forward when looking forward (must be calibrated).
  2. The player gradually turns left when looking left, and vice versa.
  3. The player pitches up when looking up, and vice versa.
  4. The player barrel rolls when tilting their head.


Star Fox 64's barrel roll -- but infinite.

By using head-tracking to maneuver the ship, you free up the dual analog sticks (on a controller) or WASD and the mouse (on a keyboard-and-mouse setup).  On a controller, this not only grants easy barrel rolls; it also allows each analog stick to control its own gun.

Remember buying out both of those toy guns in a two-player arcade shooter?  Imagine that, but while controlling a spaceship.

Mono-taskers beware: this means simultaneous control of seven axes (three from the head and two from each thumb).  Add additional buttons as your heart desires.
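As a rough illustration of the four rules above, here is a C++ sketch under stated assumptions: the types, helper functions, and tuning constants (deadzone, turn rate) are hypothetical and not taken from any real SDK. Head yaw/pitch/roll steers the ship, leaving each thumbstick free to aim its own turret.

```cpp
// Sketch of the four-rule space-sim scheme: head orientation steers the ship,
// each analog stick aims its own gun -- seven continuous axes in total.
#include <cmath>
#include <cstdio>

struct HeadPose { float yaw, pitch, roll; };  // radians, relative to calibration
struct Stick    { float x, y; };              // each axis in [-1, 1]
struct Ship     { float heading, pitch, roll; };

const float kDeadzone = 0.05f;  // ignore tiny head movements
const float kTurnRate = 1.5f;   // radians/sec at full head deflection

float deadzone(float v) { return std::fabs(v) < kDeadzone ? 0.0f : v; }

void steerShip(Ship& ship, const HeadPose& head, float dt) {
    ship.heading += deadzone(head.yaw)   * kTurnRate * dt;  // rule 2: look left, turn left
    ship.pitch   += deadzone(head.pitch) * kTurnRate * dt;  // rule 3: look up, pitch up
    ship.roll    += deadzone(head.roll)  * kTurnRate * dt;  // rule 4: tilt head, barrel roll
    // Rule 1: with all three axes inside the deadzone, the ship flies straight.
}

// Each stick drives its own turret, independent of where the ship is pointing.
void aimTurret(const char* name, const Stick& s) {
    std::printf("%s turret aim: (%.2f, %.2f)\n", name, s.x, s.y);
}

int main() {
    Ship ship{0, 0, 0};
    HeadPose head{0.4f, 0.1f, 0.02f};        // pretend sensor reading
    steerShip(ship, head, 1.0f / 60.0f);     // one 60 fps frame
    aimTurret("left",  Stick{-0.2f, 0.7f});  // left thumbstick
    aimTurret("right", Stick{ 0.9f, 0.1f});  // right thumbstick
    std::printf("ship heading=%.3f pitch=%.3f roll=%.3f\n",
                ship.heading, ship.pitch, ship.roll);
    return 0;
}
```

The small deadzone around the calibrated "forward" pose is what makes rule 1 work in practice: the ship keeps flying straight rather than drifting with every tiny head movement.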



First-person shooter:

The first-person shooter gains one free axis from the Oculus Rift's head-tracking, since the vertical component of the mouse or right analog stick (for PC and controllers, respectively) is now handled entirely by the headset.  That freed-up axis can be used to control depth: a zoom function for sniper rifles, or a range control for weapons like mortars and grenades.  Previous games had trouble handling these functions.  Zoom has historically required either a toggle key (most shooters), two axis-like letter keys, or the scroll wheel.  Mortars have historically required an implicit "hold down for this long to shoot this far" technique (as in Gears of War) or a "siege mode" that restricts player movement.
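Here is a sketch of how that freed-up vertical axis might be used; everything in it is a hypothetical illustration rather than any particular game's implementation. Mouse Y movement accumulates into a 0-to-1 depth value, which then maps onto scope magnification or mortar range.

```cpp
// Sketch: head pitch already handles vertical aim, so the mouse's vertical
// movement can drive a continuous depth control instead -- scope zoom for a
// sniper rifle, or shell range for a mortar. All values are illustrative.
#include <algorithm>
#include <cstdio>

struct DepthAxis {
    float value = 0.0f;  // 0 = nearest / no zoom, 1 = farthest / max zoom
    void accumulate(float mouseDeltaY, float sensitivity = 0.002f) {
        value = std::clamp(value + mouseDeltaY * sensitivity, 0.0f, 1.0f);
    }
};

float sniperMagnification(const DepthAxis& d) {
    return 1.0f + d.value * 11.0f;    // 1x .. 12x zoom
}
float mortarRangeMeters(const DepthAxis& d) {
    return 50.0f + d.value * 450.0f;  // 50 m .. 500 m arc
}

int main() {
    DepthAxis depth;
    depth.accumulate(120.0f);  // pretend the player pushed the mouse forward
    std::printf("zoom: %.1fx, mortar range: %.0f m\n",
                sniperMagnification(depth), mortarRangeMeters(depth));
    return 0;
}
```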

By adding the element of depth to the mix, first-person shooters can offer more engaging gameplay, which could lead to the creation of some truly unique weapons.



As a generalization:

By adding these three new axes, the Oculus Rift has given game designers a few new toys to play with.

What can you imagine these three axes being used for?


Comments


Ian Fisch
I don't think using your head angle to aim your gun in an FPS is the optimal solution. I think people would prefer to use a motion-tracked controller or a standard gamepad.

I don't think it should be used to control the pitch and yaw of a spaceship either. I think your head should be used for looking around your environment. Otherwise, the headset might as well just be a cramped TV monitor.

Luis Guimaraes
I definitely don't want to use my head to aim in an FPS.

Matthew Downey
That's a valid point, now that I think about it (the space simulator would feel like a TV monitor again; the cool part about the Oculus Rift is being able to look at the displays and controls within the flight deck). In my defense, it was meant for an arcade-y space shooter.

With respect to first-person shooters, that's how they set up the system in the first-person demo: the right analog stick cannot be used to look up or down (I had to learn how to use my neck). The demo seemed to be intended for an RPG and a sci-fi shooter. However, that was just how those developers saw the tech; it's still our choice as game designers how to use it.

Team Fortress 2 should give us a good idea of how developers can handle--and audiences can cope with--the tech in an FPS when it is unveiled at GDC later this March.

Jannis Froese
I definitely look forward to the immersion of using Star Citizen (a space simulator) with a joystick controlling the ship and the Oculus Rift for looking around.

However, it's true that we never fully solved the problem of arbitrary 3D movement in space. I could imagine something like your scheme as a separate input mode for dogfighting. Then I could finally do a barrel roll while turning 180 degrees backward and firing my back thrusters, all with a head movement and speed control on the keyboard/joystick/controller.
Actually, I think I totally need this :)

Steven Christian
I agree, head movement should be for looking around, not aiming.
This way we can add an extra dimension to gaming, rather than being restricted to using it as a new mouse/joystick.

Matthew Downey
It seems like you do aim with your head in Team Fortress 2's version of VR.

“…it’s really impressive and easily the best use of 3D technology I’ve ever seen… At one point I shotgun a demoman in the motor pool of Badwater, then switch to my pistol to whip a volley of shots over my left shoulder (yes, it really felt that way) to drop a soldier coming down the ramp. Watching him fall, I start to feel like VR is something I could get used to.” -- PCGamesN

http://www.oculusvr.com/blog/team-fortress-2-in-the-oculus-rift/

