A Minimal VR System
My recommendation would be to support head tracking (rotations + translations), tracking of at least one hand (rotations + translations), and a joystick with a couple of buttons. In my personal experience, once you have this minimum setup you cross a threshold, and your brain accepts this other reality much more easily.
This means that, for me, the Oculus Rift by itself is not (yet) a minimum VR platform. It's missing head position tracking and doesn't provide any kind of hand tracking. I know you can easily add it yourself with devices such as the Razer Hydra or others. But unless we have a complete VR platform, game developers can't rely on the fact that players all have the same standard hardware.
Latency
The first enemy of VR is latency. If you move your head in the real world and the resulting image takes one second to appear, your brain will not accept that this image is related to the head movement. What's more, you will probably get sick as a result. John Carmack reports that "something magical happens when your lag is less than 20 milliseconds: the world looks stable!"
Some researchers even advise a 4ms end-to-end latency from the moment you act to the moment the resulting image is displayed. To give you an idea of what this means: when your game runs at 60 frames per second, that's already about 16ms from one frame to the next. Add to that the latency of your input device, which can range from a few milliseconds to more than 100ms with the Kinect, and the latency of the display, which also ranges from a few milliseconds to more than 50ms for some consumer HMDs.
And if you want to run your game in stereoscopy, keep in mind that the game needs to compute the left and right images for each frame. As a game developer, you can't do much for the input and display latency, but you have to make sure that your game runs fast!
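To make the arithmetic concrete, here is a minimal sketch of an end-to-end latency budget. The function name and the sample numbers are my own illustrative assumptions, not measurements of any specific device:

```python
# Hypothetical end-to-end latency budget: input latency + rendering time
# + display latency should stay under roughly 20 ms (Carmack's threshold).
def latency_budget_ms(input_ms, frames_to_render, fps, display_ms):
    frame_ms = 1000.0 / fps            # duration of one frame at this refresh rate
    render_ms = frames_to_render * frame_ms
    return input_ms + render_ms + display_ms

# A 60 fps game with a fast tracker (2 ms) and a fast panel (5 ms),
# rendering within a single frame:
total = latency_budget_ms(input_ms=2, frames_to_render=1, fps=60, display_ms=5)
print(round(total, 1))  # 23.7 -> already above the 20 ms "magic" threshold
```

Even with optimistic numbers, a single 60 fps frame eats most of the budget, which is why every extra frame of buffering (or a slow input device) is so damaging.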
For more information about latency, I recommend these great articles by Michael Abrash and John Carmack (my personal heroes): "Latency, the sine qua non of AR and VR" and "Latency mitigation strategies."
A Coherent World, Not Necessarily a Realistic One
We have seen that perceptive presence requires you to fool your senses in the most realistic way. Cognitive presence -- fooling the mind, not the senses -- results from a sense that your actions have effects on the virtual environment, and that these events are credible. This means that you must believe in the "rules" of the simulation. For this, you must make sure that your world is coherent, not necessarily realistic. If a player can grab a particular glass, for example, but can't grab another one, it will break presence because the rules are not consistent. Once cognitive presence is broken, it's very difficult to "fix" it. The player is constantly reminded that the simulation is not real, and it will take some time to accept it again as reality.
If you're targeting a visually realistic environment, it is more likely to generate breaks in presence. This is because your brain will expect many things that we are not yet able to achieve technically: perfect physics, sound, force feedback so that your hand doesn't penetrate an object, objects breaking into pieces, smell, etc. A non-realistic environment lowers the expectation that everything should be perfect, resulting in a more consistent feeling of presence.
If you manage to achieve cognitive presence and fool the mind of your player, the events of the simulation will affect his sensations. If an attractive character looks a shy guy in the eyes, his heart rate might increase, he might blush, etc. People with a fear of public speaking will react with anxiety when speaking to a virtual audience.
This is why the application I still find the most immersive is "Verdun 1916-Time Machine." It fools many senses at a time: vision, smell, touch... But the most important point is that, by design of the "experience," the interactions are extremely simple: you can only rotate your head, because you're a wounded soldier.
Given that limitation, it's extremely simple to keep the player from experiencing a break in presence. You can't move your hand, so it cannot penetrate objects, and you aren't forced to navigate with an unnatural joystick. It has been reported several times that some people smiled at the virtual soldier who comes to save the player in the simulation!
Measuring Presence
The problem is that it's very difficult to concretely measure whether a player feels present in the world. There are currently no absolute indicators for that. You can measure the heart rate or skin conductance if you want to evaluate anxiety. But this is only relevant for stressful simulations.
What you can try to evaluate though is if the player is responding naturally. We already mentioned a few natural reactions: trying to catch a ball, fear of heights near a cliff, fear for your virtual body if somebody is trying to hurt you, trying to avoid collisions...
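As a purely illustrative sketch of the heart-rate idea above (the function name and scoring scheme are my own assumptions, not an established presence metric), one physiological signal could be turned into a crude anxiety score like this:

```python
# Hypothetical anxiety indicator: mean relative heart-rate increase over a
# resting baseline during a stressful scene. As noted in the article, this
# is only meaningful for stressful simulations.
def anxiety_score(baseline_bpm, samples_bpm):
    """Return the mean relative increase over baseline (0.0 = no change)."""
    if not samples_bpm:
        return 0.0
    mean = sum(samples_bpm) / len(samples_bpm)
    return max(0.0, (mean - baseline_bpm) / baseline_bpm)

# Resting at 65 bpm; near the virtual cliff the player averages 78 bpm:
score = anxiety_score(65, [75, 78, 81])
print(round(score, 2))  # 0.2 -> a 20% elevation over baseline
```

In practice you would compare scores between a stressful virtual scene and a neutral control scene, since heart rate alone says nothing about *why* the player is aroused.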
Comments
I used to work for Virtuality back in the day. We specialised in VR games for the arcade, so it wasn't just research labs and enthusiasts. We also did plenty of work for those industries you mentioned in the article, which actually proved to be a lot more lucrative. Unfortunately, back then (the mid-'90s) industries were quite scared of sticking their necks out with this new tech, although there were quite a few which did, and it worked really well for them, mainly in advertising.
It's great to see that the tech is making a resurgence, and very affordable too, so I'm hoping to see some interesting apps/games pop up.
I've had a play with the Oculus Rift too, and wasn't expecting much, but boy, was I surprised! And this is coming from a VR 'veteran' :)
Thanks for the job you did so long ago :)
When are we going to get together and remake Exorex or Dactyl for Oculus then? :p
I am sure I have the code lying around somewhere! ;)
I just want to reiterate what Tarique said about "not expecting much" but actually being blown away by my first go in an Oculus. It was also a pleasure to meet the Oculus guys at Fanfest in Iceland this year.
I am excited to see where all this goes...
Stephen.
One thing worth mentioning, in relation to your article, is sound. This sometimes took a backseat during development, which is a shame as I would always argue that getting the sound right was as important as visuals, to create a more immersive experience for the player.
As well as ambient sounds, intelligently placing sounds in 3D space adds so much to the overall experience. For instance, the sounds of a bubbling stream would come from the direction of the stream itself. Elements like this would be used either to nudge the player to head in that direction, or to make them notice things outside their current field of view.
Ideas like this came about as we noticed that a number of gamers would never look around, due to their unfamiliarity with VR (or it might have been due to having a heavy dustbin strapped to their heads!). Adding audio cues outside of their field of view helped a great deal.
Coupled with this, adding gameplay elements slightly outside the player's FOV is also a good technique. After all, it's a VR experience, and we want them to actually look around, rather than have everything happen right in front of them!
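The cue-outside-the-field-of-view idea above can be sketched as a simple angle test. This is a hypothetical helper (names and the symmetric-FOV assumption are mine, not from any engine):

```python
# Is an object outside the player's current field of view? If so, a
# spatialized audio cue from its direction can invite them to look around.
import math

def angle_from_gaze_deg(gaze_dir, to_object):
    """Angle in degrees between the gaze direction and the object direction."""
    dot = sum(g * o for g, o in zip(gaze_dir, to_object))
    ng = math.sqrt(sum(g * g for g in gaze_dir))
    no = math.sqrt(sum(o * o for o in to_object))
    cos_a = max(-1.0, min(1.0, dot / (ng * no)))  # clamp for acos safety
    return math.degrees(math.acos(cos_a))

def outside_fov(gaze_dir, to_object, fov_deg=90):
    """True when the object lies outside a symmetric FOV around the gaze."""
    return angle_from_gaze_deg(gaze_dir, to_object) > fov_deg / 2

# Player looks down +Z; the bubbling stream is off to the right (+X):
print(outside_fov((0, 0, 1), (1, 0, 0.2)))  # True -> play the audio cue
```

A real implementation would use the HMD's tracked orientation and the vector from the head to the sound source, but the core test is the same dot product.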
You're also right about converting existing titles to VR. It's hard, and they just don't work very well. I was tasked to do this on a previous contract, converting 'Soldier of Fortune', and a couple of other titles to VR, and there was far too much to be done to even get it remotely working well.
Differences in FOV and the ability to look around would cause major issues with the culling, etc. Of course, being (fast-paced) first-person shooters, they don't really lend themselves to playing with a tethered headset unless you want to strangle yourself with the cabling! Coming up with new, more innovative game mechanics is the way to go, and a lot of thought needs to go into the design of the environments so people don't get themselves tangled up, which takes them out of being 'present'.
Anyway, fabulous article, and a great primer into VR development.
You're totally right about sound; we have experienced it many times.
Adding basic sounds is "easy" and adds so much to immersion!
And yes, you have to know and incorporate the constraints of your hardware (cables, resolution, etc.) in your game design, just like you do with a keyboard or mouse!
To be honest, having tried VR on and off for years, it never really grabbed me the way I wanted it to; it has always been kind of gimmicky. But the Oculus has really done something right. I'm really looking forward to seeing their consumer model, hopefully with the resolution upgrade, etc.
Anyway, I just wanted to say that I'm kind of envious of you guys. You paid your dues, and I think your field of gaming is about to pay off and take the industry by storm.
Best of luck to you.
I am interested in how a technology like the Oculus Rift can be used in a third-person action game. All the demos I have seen of it have been in first person. Does it make sense to use it for third person?
Thank you!
That's a great question!
Some people say it will work:
http://www.roadtovr.com/2013/05/10/vr-and-3rd-person-an-unexpected-world-of-possibilities-5697
Neuroscientists have also shown that your brain can accept a third-person representation of your body:
http://www.youtube.com/watch?v=mD7NzrBgXwM
So in theory it should be possible, but I think it will probably be harder to achieve presence, because your brain has to accept the whole avatar as its own representation. Your brain is used to the first-person view; that's its natural way of living.
In first-person view, you "only" need to recreate some parts of your body correctly. In particular, you don't need to recreate a whole realistic face. It's already very hard to create a realistic face (of somebody else) in real-time 3D that you can accept; it's even harder to accept it as your own! You would have to replicate every facial expression nearly perfectly. I have no doubt we will get there soon.
The other possibility is that you act as a kind of god, and the avatar is not exactly you; it's your agent, which would make things easier.
This obviously has lots of design implications, as if you are designing an immersive 3rd person game, you need to make sure that the "person" you become is one that could see the entirety of the world.
My top concern though is the controls, as the Oculus will have trouble launching without a comfortable control scheme. Keyboards and controllers are the most direct, but also not especially conducive to the player's presence. However, tracking motion gives me an idea...
Programming 3D: I'd hate to get stuck in the middle of gimbal lock within a VR environment...
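The gimbal-lock worry above can be demonstrated in a few lines. This is a self-contained sketch of my own (using the common ZYX yaw-pitch-roll convention); real engines and VR SDKs typically store orientations as quaternions precisely to avoid this degeneracy:

```python
# Gimbal lock with Euler angles: at pitch = 90 degrees, yaw and roll end up
# rotating about the same world axis, so distinct (yaw, roll) pairs collapse
# into one and the same orientation -- a degree of freedom is lost.
import math

def rot_x(a):  # roll
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):  # pitch
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):  # yaw
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def euler_zyx(yaw, pitch, roll):
    """Compose R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    return matmul(rot_z(yaw), matmul(rot_y(pitch), rot_x(roll)))

# Two different Euler triples, both with pitch = 90 degrees...
a = euler_zyx(math.radians(30), math.radians(90), math.radians(0))
b = euler_zyx(math.radians(0), math.radians(90), math.radians(-30))
# ...produce the same rotation matrix:
same = all(abs(a[i][j] - b[i][j]) < 1e-9 for i in range(3) for j in range(3))
print(same)  # True
```

Because only the difference roll - yaw survives at 90 degrees of pitch, interpolating or integrating head orientation through that pose with Euler angles misbehaves, while quaternions remain well defined everywhere.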
Great Informative Article!!!
I would love to see third-person games in VR too, as you can always adopt the psychology of a god mode, or consider yourself an accomplice or guide of the main protagonist who executes your wishes... it's a matter of the approach you adopt. In that case there is no need for the avatar to resemble your looks, is there?
I have played Dead Space 1, 2 and 3, Mass Effect 1, 2 and 3, Dragon Age 1 and 2, and Darksiders 2, and they all seemed nearly perfect, with only a few glitches, in stereoscopic 3D with TriDef Ignition on my LG D2342P. In a dark room, with my Sennheiser headphones, sitting just a foot away from the 23" 3D monitor, playing all these games in 3D makes me feel inside the game. Of course I'm not truly part of the game, but I already had this feeling of being inside it.
I know that with an HMD like the Rift this will be magnified tenfold, so I feel sad when people express sarcastic or skeptical views on third-person games in VR.
They will surely be magnificent if the basic guidelines you already mentioned are followed.
I also strongly believe that adopting a flexible (bendable) OLED display would be ideal in a future Rift, as it could ensure a 180-degree horizontal FOV, which I think is a must for a fully immersive experience. With an HD screen and good head tracking, it could give the "Matrix" experience many people, including me, crave. We already have 10.1" tablets with 2048x1536 displays, which could translate into 1024x1536 per eye, and things are improving.
I would also like your view on a variable 3D depth model, where depth is highest in the center of the screen and decreases gradually towards both edges, reaching a minimum there. I think this matches the real-world scenario of how we see, with depth highest at 90 degrees and lowest (or nil) at 0 and 180 degrees. I don't know what the hurdles to adopting it would be, but it could help a lot in keeping the interface/HUD at the corners. A diegetic HUD like Dead Space's is great, but that's beyond logic for mythological or historical games, and it cannot be applied in FPS games.
For those of you interested in ways to measure presence in your games/virtual worlds (and experience in general), you can check out this paper I wrote a few years back (http://www.academia.edu/930920/Virtual_Experience_Test_A_virtual_environment_evaluation_questionnaire). I'm happy to answer any questions you might have about how to use the VET if you decide to look into it further.
@Sébastien Kuntz - did we meet when I was briefly consulting for Ian the Virtuality Dev Director?