Deep Dive is an ongoing Gamasutra series with the goal of shedding light on specific design, art, or technical features within a video game, in order to show how seemingly simple, fundamental design decisions aren't really that simple at all.
Check out earlier installments, including using a real human skull for the audio of Inside, the challenge of creating a VR FPS in Space Pirate Trainer, and creating believable crowds in Planet Coaster.
I’m Olivier Palmieri, and I'm the Game Director on Eagle Flight at Ubisoft Montreal. I've been working at Ubisoft since 1998, in several Ubisoft studios around the world. I had the chance to work on many Ubisoft brands, such as Assassin's Creed, Far Cry, Splinter Cell and Rayman.
I studied electronics and computer science at school. I personally love science. This love of science helped me with the creation of our control system in Eagle Flight, where we focused on trying to deliver a very comfortable experience, but also intuitive and precise controls.
After working on Far Cry 4, I wanted to dive into studying VR and create experiences for it. We regrouped as a small team of VR enthusiasts to explore this new medium and embark on a great adventure.
We started the Eagle Flight project first as a research & development study on how to make the most of VR technology.
Before creating a game, we wanted to study this new medium, and ensure we fully understood the hardware.
Our focus was first on comfort. We knew that motion is a challenge in VR, but we were convinced there were solutions that allowed full motion while remaining comfortable.
We also focused on making the controls very intuitive, so everybody could pick up the game and quickly know how to play. We wanted to share Eagle Flight with as many people as possible, gamers and non-gamers alike.
Finally, because we believe in the ethos of, "easy to pick up, but hard to master", we wanted controls that allowed the player to be very precise and reactive, so we could provide expert challenges, too.
As a science lover, I wanted to study the reasons why there could be discomfort in VR.
Testing existing VR experiences back in 2014, we saw that many of them chose static experiences or teleportation mechanics to avoid VR discomfort.
While studying the biological reasons behind VR discomfort, we found that it is mainly caused by a disconnection between the visual perception system (what the eyes see) and the sense of balance and acceleration (perceived by the inner ear). Motion sickness is a defence mechanism against neurotoxins: when the brain detects an inconsistency between what we see and what we sense, it interprets it as a possible sign of poisoning and triggers nausea to protect the body.
We worked on controls and techniques that allowed us to reduce these inconsistencies. Early on, we decided to use the players’ head as the main control input, tracking the user's head orientation in the real world and mimicking it in Eagle Flight.
Following this research, we also created what we call ‘TILT control’ in Eagle Flight.
The main intention was to allow players to turn a full 360 degrees in the virtual world without requiring them to turn their bodies. The TILT control also lets the user play comfortably from a fixed seat, such as a couch.
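The article doesn't give the actual mapping from head tilt to turning, but the idea can be sketched as a function from head roll to yaw rate. The deadzone, maximum roll and maximum turn rate below are illustrative assumptions, not Eagle Flight's real tuning values:

```python
def tilt_to_turn_rate(head_roll_deg: float,
                      deadzone_deg: float = 5.0,
                      max_roll_deg: float = 45.0,
                      max_turn_rate_deg_s: float = 90.0) -> float:
    """Map the player's head roll (tilt) to a yaw rate for the eagle.

    All thresholds are hypothetical tuning values for illustration.
    """
    # Clamp extreme tilts so the turn rate saturates.
    roll = max(-max_roll_deg, min(max_roll_deg, head_roll_deg))
    if abs(roll) < deadzone_deg:
        return 0.0  # small involuntary head tilts don't turn the eagle
    # Ramp the turn rate smoothly from zero at the deadzone edge
    # up to the maximum at full roll.
    sign = 1.0 if roll > 0 else -1.0
    t = (abs(roll) - deadzone_deg) / (max_roll_deg - deadzone_deg)
    return sign * t * max_turn_rate_deg_s
```

A deadzone like this matters in practice: players tilt their heads slightly all the time, and without it the eagle would drift constantly.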
Fun fact: at events, we found that the TILT control felt very natural to actual aeroplane pilots who played Eagle Flight, as it mirrors the way a plane is controlled in real life. Flying animals also reorient themselves by tilting their wings and bodies.
Using the player's head to navigate the world allowed us to reach a level of precision and reactivity beyond our initial expectations. It also captured the instinctive reactions and micro-adjustments we make naturally.
For example, when you see that you are approaching an obstacle at full speed, your natural reaction is to duck or change direction to avoid it. Because we use these motions in a very reactive game, players could avoid obstacles instantly and even thread very narrow paths and passages in an intuitive, effective way.
In fact, bypassing the gamepad joysticks shortens the path between decision (the brain) and reaction (the head setting the direction), which has two main benefits. First, the reaction time is as short as humanly possible. Second, there is nothing to learn: no joystick manipulation (routing information from your brain to your fingers), and no mental model of how the game's mechanics and physics will respond. You simply look where you want to fly.
This head-driven precision allowed us to create very tricky, dexterous levels, such as the metro and catacombs, where players can challenge their skills. Eagle Flight's online leagues and per-mission leaderboards allow players to compare their performance and dexterity.
In Eagle Flight, we chose Paris as our setting because the city is a natural playground suited to the kind of challenges and navigation we wanted: wide and narrow streets, irregular street patterns, parks and landmarks. Paris's landmarks (e.g. Notre-Dame and the Eiffel Tower) also help players orient themselves, giving them reference points no matter where they're heading.
In our research, we also noticed that we constantly see our own nose in our field of view. It helps us (and our brain) by providing a fixed reference point in our vision, and the brain then filters the nose out of our final perceived image.
With a VR headset, our eyes no longer see the nose, and our brain is destabilised. In Eagle Flight, the eagle's beak brings back that simulated nose effect, restoring the stabilising reference point needed for comfort.
We worked on many other techniques to help improve comfort. One of them is what we call our ‘Dynamic Blinders’.
Human peripheral vision is the most sensitive to detecting motion. That was especially useful early in our evolution, when we had to hunt for food while watching for dangerous animals that could approach from any direction.
We decided to reduce the perception of fast motion in the peripheral vision, but only when needed. Our dynamic blinders come into play when we detect fast motion specifically in the left, right, upper or lower sections of the view. They dynamically hide fast motion in those specific areas, and only when we detect potentially problematic motion, rather than imposing a constant tunnel vision.
This dynamic reduction of the field of view is accepted by the brain because it's a natural process the brain can trigger to help us focus. For example, tunnel vision is very common among Formula One drivers, helping them focus on the circuit and react quickly.
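The blinder logic described above can be sketched as a function from rotation rates to per-edge vignette opacities. The thresholds, the smoothstep fade, and the assumption that yaw drives the left/right blinders while pitch drives the top/bottom ones are all illustrative guesses, not the game's actual detection scheme:

```python
def smoothstep(edge0: float, edge1: float, x: float) -> float:
    """Hermite interpolation clamped to [0, 1], as in GLSL's smoothstep."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def blinder_opacities(yaw_rate_deg_s: float, pitch_rate_deg_s: float,
                      fade_start: float = 30.0,
                      fade_full: float = 120.0) -> dict:
    """Opacity of each peripheral blinder (0 = invisible, 1 = fully opaque).

    Assumption for this sketch: a fast yaw sweeps the scene across the
    left/right periphery, a fast pitch across the top/bottom. Below
    fade_start deg/s the blinders stay off; at fade_full they are fully
    opaque, with a smooth ramp in between (hypothetical tuning values).
    """
    h = smoothstep(fade_start, fade_full, abs(yaw_rate_deg_s))
    v = smoothstep(fade_start, fade_full, abs(pitch_rate_deg_s))
    return {"left": h, "right": h, "up": v, "down": v}
```

The smooth ramp is the important part: snapping the blinders on and off would itself be a visible peripheral change, which is exactly what the technique tries to avoid.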
Throughout the game’s production, we organised playtests to help us refine and tweak the game's controls and comfort solutions.
We also demoed Eagle Flight at many events, such as E3 2016 and Gamescom, which gave us a great deal of player feedback on the game and its control systems.
We've been very happy with the results we achieved, with our game being played and enjoyed by so many people.
We're having great results with comfort: the vast majority of players are totally comfortable playing the game. This often comes as a surprise to them, as they were not expecting such comfort during full-speed flight.
Finally, players are very impressed by the level of precision and reactivity they are achieving within the game.
We strongly believe that you need to try VR to understand its immersion, and that you need to try Eagle Flight to understand its level of precision and reactivity, its depth, and its level of comfort.
I hope many players will try Eagle Flight and learn to fly! See you in the skies of Paris!