On Motion Sickness and the Handling of 2D UI Elements in Virtual Reality Environments
by Isla Schanuel on 12/12/13 01:06:00 pm

The following blog post, unless otherwise noted, was written by a member of Gamasutra's community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

 

Playing games on an HMD, especially in first person, can make even experienced users feel nauseous during extended play sessions. Motion sickness can severely degrade the player experience, and if games are to be successful the issue must be addressed at all stages of development. It's not enough to trust that a game can simply be "converted" to a new display type without issue; if a game is to function properly and provide an enjoyable player experience [1], then every game element, from the environment itself to the user interface, must be presented in a way that minimizes the likelihood of causing motion sickness in players.

While much has been written about visually induced motion sickness (VIMS) in the context of virtual environments, this post focuses specifically on the effects 2D UI elements can have on VIMS when using virtual reality headsets, and on how those effects might be mitigated through adjustments to the way these elements are presented to the player.

Causes of VR Motion Sickness

Motion sickness can arise in any situation where the visual information provided to the brain is inconsistent with what is being reported by the vestibular system. The prevailing theory at the moment is that motion sickness is a side effect of an evolutionary defense mechanism against neurotoxins [2]: the brain's area postrema is responsible both for resolving conflicts between what we see and what we feel, and for making us vomit if we think we've been poisoned. Alternate hypotheses exist, however [3]. In the case of visually induced motion sickness, your eyes report motion while the structures in your inner ear report either that you're sitting still, or a different type of motion entirely [4].

In virtual environments, the potential causes of VIMS come from two sources. The first is the disconnect between what our body is doing and the in-game feedback we receive; the second arises when stand-alone objects move in ways contrary to what our brain expects to see.

In the former case, the disconnect involves the elements of the vestibular system responsible for keeping our vision steady under normal conditions: specifically, the vestibulo-ocular reflex (VOR), which uses information provided by the inner ear about the head's position and orientation to counteract head movement by moving the eyes in the opposite direction, and the otolith organs, which track head orientation and movement.

The second cause is slightly more complicated, as it is a matter of objects outside ourselves and how our brains perceive their current and expected movement. However "steady" our vision may appear, it's impossible to move one's eyes completely smoothly in a single movement unless we have some sort of target to track or focus upon. This is because our eyes are constantly scanning, stopping, and refocusing as the brain takes in the world around us. These rapid movements are called saccades, and they cannot be controlled consciously; the brain actually re-adjusts the size and speed of saccades to ensure that we are seeing things properly.


   Trace of saccades of the human eye

Tracking a moving object, however, involves a completely different process called smooth pursuit. Most people can smoothly track objects moving at up to about 30° of the field of view per second; any faster than that and our eyes start "skipping" forward to catch up ("catch-up saccades"). We are also better at tracking objects moving horizontally or vertically downward than upward-moving targets.

Smooth pursuit is a two-step process, the first step of which is to "catch" the object you're going to track. This lasts about 100 milliseconds, during which your eyes dart about a bit until they can fix on the target's velocity and direction of movement, while the brain starts making calculations about the path it will take. The second stage consists of the automatic correction and control of eye movement to closely follow whatever we're looking at, and lasts until the thing we're tracking stops following that particular path.

This means that when we watch a moving object, real or virtual, our brain keeps track not only of where it is at any given point in time, but also of where it will go, in addition to our own physical state and orientation relative to that object. Once again, it's the disconnect between what we expect to see, based on where we know ourselves to be, and what we actually see happening that causes motion sickness.

Implementing a 2D UI In A Virtual Environment

Given that designing and implementing full 3D interfaces is a non-trivial process, a common interim solution is simply to overlay traditional 2D UI widgets such as menus and HUDs "on top" of the virtual environment. This technique is significantly easier to implement and allows developers to reuse traditional UI metaphors and toolkits, but it typically involves projecting UI elements at a specific depth and fixed location inside the user's field of view, so they are perceived to be "stuck" somewhere in front of one's face.
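To make that concrete, here is a minimal sketch in C of how a HUD quad corner can be positioned at a fixed depth in view space so that it follows the camera exactly. The function and parameter names are illustrative, not taken from any particular engine:

    #include <math.h>

    typedef struct { float x, y, z; } vec3;

    /* Map a normalized 2D screen coordinate (sx, sy in [-1, 1]) to a
     * view-space position at a fixed projection depth. Because the
     * result lives in the camera's own coordinate frame, the quad
     * follows every head movement exactly -- the "stuck in front of
     * your face" effect described above. */
    static vec3 hud_corner_view_space(float sx, float sy,
                                      float half_fov_rad, float aspect,
                                      float hud_depth)
    {
        float half_w = hud_depth * tanf(half_fov_rad); /* extent at depth */
        vec3 p;
        p.x = sx * half_w;
        p.y = sy * half_w / aspect;
        p.z = -hud_depth; /* fixed distance in front of the eye */
        return p;
    }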

This can, of course, result in a number of problems for the user if not handled carefully. The most well-known issue is that of depth-order violations (see below) [5], but factors such as FOV size [6], the location of elements in one's peripheral vision [7], and elements' reactivity to head movement [8] can all have significant effects on the incidence of motion sickness.


Example of a depth-order violation in Half-Life 2. The depth cues associated with the player weapon model indicate it is further away than the recharge station, but it has been rendered in front of it.
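One simple guard against this kind of violation is to pull the UI element in front of any scene geometry that would otherwise sit between it and the viewer, so that occlusion and stereo depth cues agree. The sketch below assumes a scene_depth value obtained from a depth-buffer or ray query, which is not something specified here:

    /* If scene geometry at the HUD's screen position is closer than
     * the HUD's projection depth, step the HUD just in front of it so
     * the occlusion order matches the stereo depth cues. */
    static float resolve_hud_depth(float desired_depth,
                                   float scene_depth, float margin)
    {
        if (scene_depth < desired_depth)
            return scene_depth - margin;
        return desired_depth;
    }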

Mitigating VIMS

There are, however, ways to decrease the incidence and severity of VIMS. At present, the most common treatments for motion sickness are the use of medication, or setup modifications such as those currently being collected by the Oculus Rift and TF2 communities to help users adjust settings and develop their "VR legs". Unfortunately, these solutions do very little to actually eliminate motion sickness or prevent the problem from arising in the first place.

There remains a need for developers to adjust the ways in which their games behave when played on HMDs in order to prevent, or at least minimize, the effects of VIMS, but understanding which techniques are most effective is going to require a great deal of experimentation.

Oculus has engineers looking into how predicting head movement might be used to reduce the latency between when a user's head moves and when the view appears to move, as this is a common source of motion sickness. However, while the initial reports about this technique's efficacy have been promising, the belief that such a method could enable the automatic translation of games built for flat screens to HMDs seems almost impractically optimistic.
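The core idea behind such prediction can be sketched in a few lines. This is a deliberately simplified, yaw-only illustration with invented names; production SDKs filter and extrapolate full quaternion orientations rather than raw angles:

    /* Extrapolate head yaw ahead by the expected motion-to-photon
     * latency, using the angular velocity estimated from the last two
     * tracker samples. */
    static float predict_yaw(float yaw_now, float yaw_prev,
                             float dt, float latency)
    {
        float yaw_rate = (yaw_now - yaw_prev) / dt; /* degrees/second */
        return yaw_now + yaw_rate * latency;        /* look ahead */
    }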

Additionally, there is research indicating that establishing a "base frame" (in which certain visual [9] or haptic [10] cues in VR are linked directly to real-world motion and orientation) can reduce the incidence of motion sickness. It's possible that these obvious links to real-world conditions could have detrimental effects on immersion, but given the lack of research available on the subject, we simply do not yet know.
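Conceptually, a base frame is just a visual element whose orientation is driven only by real head tracking and never by artificial camera motion. A minimal sketch, with hypothetical names (the cited studies describe the principle, not this code):

    typedef struct { float yaw, pitch, roll; } euler_t;

    /* Orient the base frame (e.g. a static grid or cockpit) from the
     * HMD tracker alone. Artificial motion such as joystick turning
     * or vehicle movement is deliberately ignored, so the element
     * always agrees with what the vestibular system reports. */
    static euler_t base_frame_orientation(euler_t hmd_tracking,
                                          euler_t game_camera)
    {
        (void)game_camera; /* intentionally unused */
        return hmd_tracking;
    }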

Given that a completely static 2D element in a virtual environment creates a mismatch between what the brain expects to see and what is actually being presented, there exists a need to find ways of presenting these UI elements that decrease the likelihood of exacerbating motion sickness during gameplay.

Case Study: Quake2VR

I worked with Luke Groeninger to create a proof-of-concept solution using his engine mod Quake2VR. The mod adds native Oculus Rift support to id Software's Quake II and includes a number of changes meant specifically to address motion sickness (e.g. allowing users to adjust the size and depth of the HUD projection, and retaining head tracking while using menus). One feature may go unnoticed when playing with the Oculus Rift, but it becomes much more obvious if you mirror what the player sees in the HMD onto an external screen: the 2D user interface elements appear to "jiggle".

From a theoretical perspective, the system through which this is accomplished can be thought of as an attempt to achieve the opposite of optical image stabilization. The movement of the user's head is used to make 2D elements appear to "trail behind" in the local frame of reference, by rotating them opposite to how the user moved their head. Because the necessary information is calculated from the same changes in HMD orientation used by the game engine, the technique is able to leverage vendor-supplied motion prediction, and ultimately provides a much tighter coupling between the 2D UI elements and the surrounding virtual environment.
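The following sketch conveys the flavor of the technique. It is a simplified reconstruction for illustration only, not the actual Quake2VR code, and it works in yaw/pitch angles where the real mod works from the engine's HMD orientation data:

    #include <math.h>

    typedef struct { float yaw, pitch; } ui_offset_t;

    /* Each frame, counter-rotate the 2D UI against the change in head
     * orientation, then ease the offset back toward center. The result
     * is a HUD that appears to lag slightly behind head movement, like
     * an object in the world rather than a sticker on the lens. */
    static void update_hud_lag(ui_offset_t *hud,
                               float head_dyaw, float head_dpitch,
                               float recenter_rate, float dt)
    {
        hud->yaw   -= head_dyaw;   /* oppose this frame's head motion */
        hud->pitch -= head_dpitch;

        float k = expf(-recenter_rate * dt); /* exponential recentering */
        hud->yaw   *= k;
        hud->pitch *= k;
    }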

(A technical report containing a detailed write-up of the technique described above is currently in review. It is expected to be available through the University of Colorado in early 2014.)

An early proof-of-concept implementation demonstrating how a 2D UI can be made to react to head motion.

While these effects typically go unnoticed while playing on a Rift, on a flat screen the 2D elements appear to behave more like balloons being pulled through the air than stickers applied directly to the lenses of one's glasses.

Unfortunately, although the features described above are completely functional and provide an enjoyable user experience (at least according to preliminary user feedback), they have not been subjected to any sort of structured user testing. As such, it is impossible to determine whether or not the methods of presenting 2D UI elements used in Quake2VR are actually effective at preventing motion sickness. The mod does, however, serve as an interesting proof-of-concept example against which the popular methods may be objectively tested at a later point in time.

Conclusion

As the use of HMDs for gaming grows in popularity, so too will the need for strategies for addressing VIMS. While many of the root causes of motion sickness may be addressed simply through incremental technological advances, it is important not to ignore how visual content is presented to players. Many of the issues with the Oculus Rift may indeed be solved as time progresses and new users provide both product and game developers with a wider set of test subjects. However, there is still a need to determine which methods are most effective at addressing motion sickness, and doing so will take a significant amount of trial and error before we can establish a solid set of best practices from which to operate.

Further Reading

On Vision and VR

References

1. Lee, C., Rincon, G., Meyer, G., Hollerer, T., and Bowman, D. A., “The Effects of Visual Realism on Search Tasks in Mixed Reality Simulation”, IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 4, pp. 547 - 556, 2013.

2. Treisman, M. (1977). Motion sickness: An evolutionary hypothesis. Science, 197, 493–495.

3. Bowins, B. (2010). Motion sickness: A negative reinforcement model. Brain Research Bulletin, Volume 81, Issue 1, 7–11.

4. Kennedy, R.S., Drexler, J., & Kennedy, R.C. (2010) Research in visually induced motion sickness. Applied Ergonomics, Volume 41, Issue 4, 494–503.

5. Ludwig, J. (2013). Lessons learned porting Team Fortress 2 to Virtual Reality. Game Developers Conference 2013.

6. Lin, J. J. W., Duh, H. B. L., Parker, D. E., Abi-Rached, H., & Furness, T. A. (2002). "Effects of field of view on presence, enjoyment, memory, and simulator sickness in a virtual environment". Virtual Reality Conference, 164-171.

7. Seya, Y., Sato, K., Kimura, Y., Ookubo, A., Yamagata, H., Fujikake, H., Yamamoto, Y., & Ikeda, I. (2009). "Effects of peripheral visual information on performance of video game with hemi-spherical immersive projection screen". Breaking New Ground: Innovation in Games, Play, Practice and Theory, September 2009, London, United Kingdom.

8. Patterson, F.R. and Muth, E.R. (2010) Cybersickness Onset With Reflexive Head Movements During Land and Shipboard Head-Mounted Display Flight Simulation. Naval Aerospace Medical Research Laboratory: Pensacola (USA). 2010: 10-43.

9. Prothero, J. D., Draper, M. H., Furness, T. A., Parker, D. A., & Wells, M. J. (1999). "The use of an independent visual background to reduce simulator side-effects". Aviation, Space, and Environmental Medicine, 70(3), 277-283.

10. Lindeman, R. W., Sibert, J. L., & Hahn, J. K. (1999). "Towards usable VR: An empirical study of user interfaces for immersive virtual environments". Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 64-71, May 15-20, 1999, Pittsburgh, Pennsylvania, USA.



Comments


Nicholas Patrick
This article hits the nail on the head. It's not enough for hardware developers to address this problem by optimizing latency. There must also be clever inventions, like the menu jiggle shown here, to bridge the gap and make extended gaming possible on HMDs.

That or drugs. Both are strong options.

Matthew Owen
I am very interested in whether the much more accurate head tracking and the projection-based visual system used in the CastAR will be better at preventing motion sickness. The reports from people who have used that system at Maker Faire and other events seem to indicate it is much less of a problem.

Paul Tozour
I've tried both. I agree that CastAR is much less likely to introduce motion sickness due to the better tracking (absolute head tracking using an external sensor vs. relative head-tracking from the headset only) and the relatively smaller FOV when you use it in AR mode.

Still, Oculus is awesome and I've personally experienced almost no motion sickness with it either, so I might just not be very susceptible to it.

Mike Griffin
I was really happy to discover I'm not prone to motion sickness using the Rift, at least in the two in-development titles (quick VR hacks, incidentally) I experienced.

I've never had a problem with motion sickness in any game.

On the other hand, I've encountered a number of people over the years that experience pronounced VIMS problems in first person shooters, typically fast-paced titles on PC. Sometimes they can't even play a game for 10 minutes before beginning to feel sick.

I would be very curious to see how transferable to VR, per user, this 'classic' FPS motion sickness is. I.e., whether someone who is already plagued by existing VIMS symptoms is guaranteed to suffer similar or worse effects under VR.

Chris Dias
I think the most obvious solution would be to turn 2D HUD elements into translucent frustums matching the FOV of the video that extend to infinity. It'd probably look muddy, but it might work with HL2's HUD.

Luke Groeninger
Can you elaborate on that?

My gut instinct is to say that it's pretty close to what's already happening (2D HUD elements typically are rendered using the same view frustum as the rest of the scene - they wind up constrained to a narrower FOV for usability reasons), but I haven't had my coffee yet and I'm not sure I'm understanding the idea correctly.

