Valve has been experimenting with biofeedback in video games, exploring how a player's physiological signals can be used to determine their emotional state, and in turn potentially affect the game design.
Speaking at the NeuroGaming Conference last week, as reported by VentureBeat, Valve experimental psychologist Mike Ambinder explained that biofeedback can be utilized in video games in various ways.
"There is potential on both sides of the equation," he noted, "both for using physiological signals to quantify an emotional state while people are playing the game, and getting an idea of how people are emotionally experiencing your game -- in a somewhat more granular fashion that you can get from watching them, or hearing them think aloud, or asking them questions afterwards."
Ambinder says that, while a mouse, keyboard, or gamepad is great for controlling video game experiences and providing the game with feedback from the player, games cannot currently tap into a player's emotional state and make use of this data.
With this in mind, Valve has run a series of experiments using its first-person zombie shooter Left 4 Dead, in which the game would measure the pH content of a player's sweat, and become more difficult and tense if the player was aroused or nervous.
The idea behind the experiment was to make the player question: "How nervous am I? How is that impacting my future performance?"
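The loop described above -- read an arousal signal, then scale the pressure the game applies -- can be sketched in a few lines. This is purely illustrative: the signal scale, the thresholds, and the idea of a spawn-rate multiplier are assumptions, not Valve's actual implementation.

```python
def difficulty_multiplier(arousal: float, calm: float = 0.3, panicked: float = 0.8) -> float:
    """Map a normalized 0-1 arousal reading to a spawn-rate multiplier.

    Below `calm` the game eases off; above `panicked` it ramps up
    pressure to keep the player tense; in between it interpolates.
    """
    if arousal <= calm:
        return 0.75  # player is relaxed: back off slightly
    if arousal >= panicked:
        return 1.5   # player is already stressed: pile it on
    # linear ramp between the two thresholds
    return 0.75 + (arousal - calm) / (panicked - calm) * 0.75
```

A real system would also smooth the raw signal over time, since instantaneous physiological readings are noisy.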
The company has also experimented with eye-tracking in games, adding the ability to control Portal 2 with eye movement.
"The eyes are quicker," reasons Ambinder, "so if you're creating a game where accuracy and speed of movement are incredibly important, you can imagine the eyes in theory being a more appealing case."
He added, "We'd love to see more game developers thinking along these lines, and trying to see what's possible when you actually incorporate this whole other access of player experience that is being ignored by traditional control schemes."
Valve co-founder Gabe Newell has discussed the potential for utilizing biofeedback and gaze-tracking in video games before, stating, "We're a lot more excited about biometrics as an input method. Motion just seems to be a way of [thinking] of your body as a set of communication channels. Your hands, and your wrist muscles, and your fingers are actually your highest bandwidth -- so to try to talk to a game with your arms is essentially saying, 'Oh, we're going to stop using Ethernet and go back to 300 baud dial-up.'"
It's a thin line between "great idea" and "useless gimmick", and those who fall on the correct side of it reap the millions. Personally, I can see some value in biofeedback for testing during development, but I don't really see it as a valuable input method. Perhaps I'm just short-sighted, but then again, the Wii Vitality didn't exactly set the world on fire.
I think it is a great idea. I wrote a paper on using pulse as a way to determine AI response to player heart rate. This would be things like enemies moving faster, being able to detect you, or becoming tougher. Imagine if, while playing a horror game like "Amnesia", you had to not only hide from the monster chasing you but also keep yourself calm in a situation that commonly causes anxiety and an increased heart rate. I can understand where this could be considered "gimmicky", but where else are we heading with games? Virtual reality is on its way, so why not biometrics?
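The commenter's idea -- AI parameters tied to the player's heart rate -- might look something like this. The resting/stressed band and the detection-radius numbers are made-up values for illustration, not from any published design.

```python
def enemy_detection_radius(bpm: float, resting: float = 65.0,
                           stressed: float = 120.0,
                           base_radius: float = 5.0,
                           max_radius: float = 15.0) -> float:
    """The calmer the player, the smaller the monster's hearing range.

    Clamp the heart rate into the calibrated band, then linearly
    interpolate the detection radius between base and max.
    """
    t = (min(max(bpm, resting), stressed) - resting) / (stressed - resting)
    return base_radius + t * (max_radius - base_radius)
```

In a horror game this makes staying calm a literal stealth mechanic: a racing heart widens the radius at which the monster notices you.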
This assumes that we all have relatively the same ability to control our various bodily functions. A controller levels the playing field somewhat because of its limited range of input, but what if you have high or low blood pressure? What if you are an athlete and have a lower BPM than the average person (even during stress)?
This brings us back to the whole depth/complexity balance. If Bio-input adds unnecessary complexity to how the game controls then it creates a poor user experience. Since a lot of our functions may not be voluntary, how about allowing it to control non-critical game systems?
Yeah, taking this a few steps further: what about when players start gaming the system by downing too many Red Bulls or too many Tylenol 3s to trick the heartbeat monitor in their favor?
Games do already assume we all have relatively similar ability to control our bodies. I have a disability known as MS (multiple sclerosis), which affects everything from my memory, eyes, hands and fingers, and legs to my swallow reflex.
And yet I play games designed for people who are "normal". I have to learn to adapt my play style and ability, based on my own skill set and the challenges my disability presents, simply using the game's "normal user" state as a benchmark for how to adjust the controls.
Whenever you create entertainment for the masses you have to start from a baseline of what presumably most of your customer audience considers "normal". I see no reason why this couldn't also be the case for biometric feedback, or better yet, have a calibration mechanic within the game for the individual user. Much like we already manually adjust the gamma or graphics settings in games.
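A per-user calibration mechanic like the one suggested here could be as simple as sampling the signal during a short resting window, then expressing later readings relative to that personal baseline. The function names and the window itself are assumptions; the point is that an athlete's 80 BPM and a sedentary player's 80 BPM would no longer mean the same thing.

```python
from statistics import mean, stdev

def calibrate(resting_samples: list[float]) -> tuple[float, float]:
    """Record a short resting window; return (baseline, spread)."""
    return mean(resting_samples), stdev(resting_samples)

def normalized(reading: float, baseline: float, spread: float) -> float:
    """How far above the player's own baseline a reading is,
    measured in units of that player's natural variation."""
    return (reading - baseline) / spread

# An athlete with a low resting heart rate:
base, spread = calibrate([62.0, 64.0, 63.0, 65.0, 61.0])
score = normalized(80.0, base, spread)  # strongly elevated *for this player*
```

The game would then react to the normalized score rather than the raw BPM, much as players already calibrate gamma to their own display.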
Or if someone walks in the room and talks to the player, or the dog next door starts barking, or the player is turning to the game to blow off steam from an argument. Players cannot be assumed to be gaming in pristine environments with complete isolation.
While I would agree with this argument, the same could be said for all online gaming. System performance, bandwidth, peripheral choice, real-world distractions, and personal circumstance have always been a factor in online gaming competitiveness, yet it hasn't stopped most people from embracing it anyway.
In a simple scenario like an online card game, some kind of bio-feedback might be able to fill in some of the 'body language' cues you get in a face to face scenario. Learning to control feedback might actually become a gaming skill in its own right. And breathe...
I wonder if any research has been done into "gamer body language". I don't mean biometrics; I mean trying to determine a player's mood from how they are playing. For example, in Left 4 Dead, if I start spraying bullets everywhere I'm probably panicked; or in Team Fortress 2, an experienced player can spot a Spy just from the subtleties of their movement, without really knowing why.