Roughly a year ago, Ubisoft announced it had a VR game in development that was all about putting players behind the eyes of eagles in flight: Eagle Flight.
It's been an intriguing project for VR devs to follow, because it's such a tricky thing to make a VR game that doesn't make players feel ill -- especially when that game asks them to move around a virtual world in an utterly inhuman way, guiding the bird's flight path by moving their head around.
When Eagle Flight lead Olivier Palmieri spoke to Gamasutra earlier this year, he seemed unfazed by the challenge. He was ebullient about the prospect of working on a game about flying and excited about the notion of games that can be controlled primarily by head movements. "It can be very accessible, very simple," he said. "It could be the next Wii."
Today, Palmieri took the stage at VRDC in San Francisco to talk in more depth about the Eagle Flight dev process and what he's learned about designing comfortable VR games. It's notable where it starts: Palmieri and his team focusing not on what they wanted to make for VR, but on figuring out what sorts of experiences are best-suited for VR headsets.
“We wanted to figure out what we could do best, what we could use best, in VR,” says Palmieri. “What can VR bring us? What is the best way to use VR? What is the game or experience that would be unique in VR, and show the strengths of the medium?”
The Eagle Flight team started as a Ubisoft Montreal R&D project in 2014, as a small team started testing and prototyping different game ideas using Oculus Rift DK2 dev kits.
Take time to read up on scientific research into how and why humans get sick
The inner ear was of prime interest to the Eagle Flight team, as was saccadic eye motion -- the way human eyes move around while tracking an object. Palmieri recommends fellow VR devs read up on this stuff, as well as the vestibulo-ocular reflex -- the way our eyes are stimulated to move in a certain way when our head moves, in order to maintain our equilibrium.
To try and minimize player motion sickness in VR, Palmieri says he and his team looked at NASA’s work on stroboscopic treatments, which try to ameliorate space sickness in astronauts by manipulating light exposure.
“Based on all that research...we developed our first prototypes,” says Palmieri. The focus of Ubisoft Montreal’s VR prototyping effort was threefold: it had to be comfortable, it had to be intuitive and accessible for a broad audience, and it had to feel precise and reactive. One of the first prototypes they came up with was Inside Notre Dame, a VR experience that allowed people to move and look around inside Paris’ Notre Dame cathedral.
“It was a learning experience,” says Palmieri. The next big prototype was Eagle Flight, in part because -- according to Palmieri -- it was a good fit for the hardware and easy to make using cityscape assets Ubisoft Montreal already had access to.
Next, the team took a shot at demoing the prototype at places like E3 and Gamescom. This was critical, according to Palmieri, but not necessarily because of the opportunity for exposure -- rather, it was key because it gave the team the chance to watch a broad audience play the prototype.
“Playtest, playtest, playtest,” says Palmieri. “We tried to playtest with everybody -- gamers, non-gamers -- to try and get the controls right.”
A big point of concern was the way Eagle Flight asks the player to control their flight using head movements. Would the human head be an awkward controller, or would using it that way make people sick?
“Using your head as a controller went beyond our expectations,” says Palmieri. “You don’t need to interface with your arm, or learn to control a joystick, or anything like that. Because it’s all in your head….there’s a very short path, so it’s very reactive. I believe it can be one of the most precise and reactive controls you can have in video games.”
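Palmieri didn't share implementation details, but a head-driven steering scheme like the one he describes might look something like the sketch below. Everything here -- the dead zone, the response curve, the tuning values -- is an illustrative assumption, not Ubisoft's actual code.

```python
import math

def head_to_turn_rate(head_yaw_deg, head_pitch_deg,
                      dead_zone_deg=2.0, max_turn_deg_per_s=90.0):
    """Map head orientation (relative to the forward axis) to turn
    rates. A small dead zone keeps micro head movements from steering
    the eagle; past it, response ramps up smoothly.

    Hypothetical sketch -- not Eagle Flight's real control code.
    """
    def axis(angle):
        if abs(angle) < dead_zone_deg:
            return 0.0
        # Normalize past the dead zone, clamp to [0, 1], then square
        # for finer control near center (a common gamepad-style curve).
        t = min((abs(angle) - dead_zone_deg) / (45.0 - dead_zone_deg), 1.0)
        return math.copysign(t * t, angle)

    return (axis(head_yaw_deg) * max_turn_deg_per_s,
            axis(head_pitch_deg) * max_turn_deg_per_s)
```

The appeal Palmieri points to -- the "very short path" -- comes from the fact that no button mapping or stick calibration sits between the player's intent and the input: the tracked head pose is the whole interface.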
Avoiding common nausea triggers in VR
Palmieri says one of the greatest risk factors, in terms of making players sick, is when a player’s inner ear conflicts with what their eyes are seeing. This triggers nausea, says Palmieri, as a defense mechanism against toxins the brain presumes are causing the disconnect.
As most VR devs probably already know, there are two good ways to cause eye/inner ear conflict: a player feeling motion but not seeing it (think: a boat cabin with no windows) and a player seeing motion but not feeling it (floating in space, for example).
Other great ways to make your players sick include fast motion while they’re very close to something (so like, zooming past a wall) and, surprisingly, pushing them through virtual walls in your game.
“Surprisingly, maybe, if you go through virtual walls, the brain doesn’t like that,” says Palmieri. “When you go against an obstacle, your brain expects a reaction….and the brain is not very comfortable with that.”
What is comfortable in VR, according to Palmieri, is constant linear forward movement. Players feel most comfortable moving in VR when they’re moving straight ahead in line with their vision, he says -- hence Eagle Flight.
He says continuous motion is ideal, and when the player has to change speeds, gradual change is best -- instant stops or speed boosts are bad news.
“So for example, in our game Eagle Flight, there is acceleration...but it is very controlled acceleration,” says Palmieri. “And we worked in effects to give the perception of acceleration, and to give the brain something to focus on.”
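One common way to implement that kind of controlled, gradual speed change is an exponential ease toward a target speed, so the player never experiences an instant stop or boost. This is a generic sketch of the technique, not Ubisoft's implementation; the time constant is an assumed tuning value.

```python
import math

def smooth_speed(current, target, dt, time_constant=0.5):
    """Move the current speed toward the target with an exponential
    ease. Larger time_constant means gentler acceleration; the
    frame-rate-independent form below behaves the same at any dt.

    Hypothetical sketch -- time_constant is an assumed tuning value.
    """
    alpha = 1.0 - math.exp(-dt / time_constant)
    return current + (target - current) * alpha
```

Calling this once per frame yields a speed curve that approaches the target asymptotically -- fast change at first, tapering off -- which is exactly the "no sudden jumps" profile Palmieri describes.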
So what if your player is going really fast -- and runs into a wall? If you can’t stop them suddenly and can’t let them clip through the wall, what do you do?
“The solution we came up with in Eagle Flight was to quickly fade to black upon collision,” says Palmieri. “The brain can’t compare the collision between the inner ear and the eye because it can’t see anything.”
Devs should know that when designing that fade-to-black, Palmieri’s team made sure to smoothly transition the “speed particles” away (instead of just removing them immediately), and delayed the fade-in of the death text, so that the player doesn’t hit something and immediately see non-moving text.
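That sequencing -- screen goes black fast, speed particles ease out, death text waits -- can be expressed as a simple function of time since impact. The timing values below are illustrative assumptions, not numbers taken from Eagle Flight.

```python
def collision_sequence(t):
    """Given seconds since impact, return (screen_opacity,
    particle_alpha, show_death_text) for a fade-to-black collision.

    All timings are assumed for illustration, not Ubisoft's values.
    """
    FADE_OUT = 0.15       # screen snaps to black quickly on impact
    PARTICLE_FADE = 0.30  # speed particles ease away, not popped off
    TEXT_DELAY = 0.60     # death text waits, so it never appears
                          # frozen against a world just seen moving

    screen = min(t / FADE_OUT, 1.0)               # 0 = clear, 1 = black
    particles = max(1.0 - t / PARTICLE_FADE, 0.0)  # 1 = full, 0 = gone
    return screen, particles, t >= TEXT_DELAY
```

The key design point is ordering: the brain is in darkness before the particles finish leaving, and both are gone before any static UI appears.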
Don't overlook the value of a good nose
Also, Palmieri recommends that VR devs don’t overlook what’s right in front of their faces: the nose.
“When you put a VR headset on, you don’t see your nose anymore, and that’s hard for your brain,” says Palmieri. “That’s one of the reasons why we added the beak to Eagle Flight -- it brings your nose back into the equation.”
It’s a fixed reference point, so it gives players something to anchor their perspective of the world on. Palmieri says you can also reduce the chances your players get sick by adding in haptic feedback, aural feedback (the sound of whistling wind and flapping wings, for example), and generally just creating a game with “engaging gameplay” -- systems that players’ brains can engage with intensely enough to mute the discomfort of being in VR.
Eagle Flight is also known for having a dynamic vision-blocking system -- as the player flies around the world, blackness creeps in around the edges of their vision to help them avoid nausea. During Eagle Flight’s development, Palmieri says he and his team spent a lot of time studying how human eyes work -- and how they compare to the eyes of predator and prey animals.
They found that human peripheral vision is the part most likely to notice motion, and since motion is a big factor in making people sick, they developed ways for Eagle Flight to obscure peripheral vision during high-risk maneuvers -- fast turns, quick movement close to large objects, and the like.
“We are trying to hide this part of the view, and so in a way we are hiding the peripheral vision,” says Palmieri. “It helps because you don’t show so much motion there, so the brain accepts it.”
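A dynamic vignette like the one described might be driven by the same two risk factors Palmieri names: turn rate and proximity to large objects. The sketch below is a guess at the shape of such a system; every threshold in it is an assumed tuning value, not Ubisoft's.

```python
def vignette_strength(turn_rate_deg_s, nearest_obstacle_m,
                      turn_threshold=30.0, max_turn=180.0,
                      near_dist=2.0, far_dist=10.0):
    """Return how strongly to darken peripheral vision (0 = fully
    open, 1 = fully vignetted), widening the black border during
    fast turns and close passes.

    Hypothetical sketch -- all thresholds are illustrative.
    """
    # Contribution from turning: 0 below the threshold, ramping to 1.
    turn = (abs(turn_rate_deg_s) - turn_threshold) / (max_turn - turn_threshold)
    turn = min(max(turn, 0.0), 1.0)
    # Contribution from proximity: 0 beyond far_dist, 1 at near_dist.
    prox = (far_dist - nearest_obstacle_m) / (far_dist - near_dist)
    prox = min(max(prox, 0.0), 1.0)
    return max(turn, prox)  # whichever risk factor is higher wins
```

Taking the max of the two factors means straight, open flight leaves the view fully clear, while either a hard turn or a wall-skimming pass alone is enough to close the vignette -- matching the high-risk-maneuver behavior described above.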
In closing, Palmieri shared his top three takeaways from developing Eagle Flight with fellow devs: