[In this reprinted #altdevblogaday-opinion piece, freelance game designer Emmeline Dobson explains why developers need to do usability testing, and offers advice on questioning participants and listening to their feedback.]
Usability testing is on the increase, led by companies like PlayableGames and Vertical Slice in the UK, and by Microsoft Games Studios, which has offered its gameplay lab services to its exclusive Xbox developers since before 2003.[1] Yet it can be regarded as expensive, troublesome to organise, and a distraction from building features for a game product.
It doesn't have to be.
Why do usability testing?
Steve Krug's book Don't Make Me Think! is about User Interface design, but it's also heavily about cheap, early, and frequent usability testing. This is because he advocates it as a key part of the most reliable method to improve a product in development.
This design model could be adopted more widely in game dev, and it allows usability problems to be identified much earlier in development.[2] The idea is the same as for games QA testing in general: bugs found earlier are much cheaper to fix than bugs found later.
Using a design cycle that incorporates testing with the target audience as a key step also encourages a design spirit more in touch with the end-user - and real end-users at that, rather than a dev team that "makes games for ourselves" where members ask each other to test features casually.
Skinny usability testing
[Diagram adapted from Hussein's Developing E-Learning Materials]
A usability testing lab looks something like this:
[Lab layout based on the labs at PlayableGames]
The lab environment is designed to put the test subject at ease so that they play the game, as much as possible, the way they would at home. It's also set up to collect as much data as possible from the session, with multiple camera views of the tester and footage of button presses as well as on-screen action.
But what if you don't have pockets like Valve, aren't bosom buddies with Microsoft, and your executive producer doesn't want to spare the budget for usability testing until well after Alpha? When making Dragonology for Wii at Kuju Entertainment's kids games studio, we knew we wouldn't understand the target market of 7-12 year-old boys and girls as well as possible unless we actually brought in some real kids to test the game!
Much of the value of a usability test can be captured simply through key members of the dev team being able to watch a user play an early build of the game. In our studio, a handful of members of staff were able to bring in their own kids for a morning or afternoon, and the design team and others were able to watch them play early versions of the game, listen to what they said, watch what they did, and ask them questions. It wasn't a scientifically significant number of testers, but it quickly recalibrated our ideas in key areas such as, "What level of difficulty is appropriate for kids this age?", "Is the story of interest to them?", and "How long does their attention-span stretch?"[3]
Facilitating the session - questioning and listening
Not asking too many questions was a good strategy: the idea is for testers to be as relaxed as possible so that they spontaneously express what they are experiencing. Ideally the experience should be close to what they'd feel at home, because being watched and probed for information is weird.
I found that we had a livelier and more natural test session when one of the artists brought in his two daughters, aged 8 and 6. Dad was also not far away, so the sense of it being an artificial environment was much lessened. We watched them asking each other for help and making suggestions to whoever's turn it was, which in a sense did the job of a facilitator for us. One of the girls got so excited she waved the Wii Remote over her shoulder in order to try to get the dragon to turn around 180°. Another time, the one watching encouraged the other: "Try to land on that chimney!" By contrast, solo user testing sessions felt more awkward and staged.
Asking the right kind of questions, and preparing an agenda of the design areas you want the session to shed light on, is important. Prepare some open-ended questions - ones that elicit more than yes/no answers. Follow what testers believe their in-game goal is and what they think they need to do to achieve it. If they're confused, that's a usability issue to review. Test that HUD and menu elements and in-game objects are getting the message across: ask testers what they think these mean, what they expect to happen when they interact with them, and why they think that.
Listen to your usability testers and empathise with what they are saying. Spontaneous responses are valuable; for example, one 14-year-old boy gasped with genuine surprise at a part of the game that was working well. The sessions also revealed my own bias: I'm a strong believer that "gameplay rules!", but I was surprised to hear from this 14-year-old that he watches every cutscene and pays attention to the dialogue, because it's the story in games that draws him in. I hope to cater more widely for different kinds of player, like this user tester, in future.
Issues found during testing & what we changed
Probably the most startling example of a bug we found during user testing is illustrated in this 50-second video. We found this bug quite near the end of the project, as the tutorial it came from went in later.
However, because we did early usability testing before Alpha, we were able to make sweeping improvements to our game while we could still make big decisions. About a quarter of our game's missions were linear race courses through the sky against a timer. A 12-year-old female tester repeatedly failed to complete these, and we were able to adjust the difficulty accordingly. She also said she didn't understand the timer when we asked her what she thought it was; we hadn't explained it.
Combined with seeing other testers wanting to explore the world rather than being forced to play the game only the way we had designed, we decided to put in a free-flying exploration mode. We also took out the punishment of returning to the start to try again for the linear race missions and instead recorded the player's best times and gave the player easy / medium / hard medals for their performance.
How to make more out of usability testing next time
Small, cheap HD video cameras like FlipCams mean it is now very easy to capture footage of user testing sessions and share it (e.g. via a studio wiki). Only a limited number of our staff saw the usability test subjects playing the Dragonology build, but I heard anecdotally (from another studio) that it was a real eye-opener for senior staff - lead programmer, art director, and lead level designer - to attend an off-site usability lab session. A usability test session can be a real wake-up call to prioritise the most important fixes for your game. It is also a potential morale boost for developers to see that the game they have been plugging away at for months and months is starting to sparkle in players' eyes as hoped!
Further issues raised by the experience
Game difficulty is an interesting area to prepare feedback questions for, as users may say something is "too easy", "too hard", "boring", or "confusing", but understanding the reasons behind these comments requires discernment. I might prepare questions such as: "Do you know if you're progressing?" "How can you tell?" "Has the game rewarded you as you deserve?" "What goal do you want to achieve next?" As mentioned earlier, I want to ask open-ended questions, not yes/no ones.
In the future, I would like to bring a more nuanced understanding of the different kinds of fun in games to the user testing stage. Observing regular people playing games helped me see the importance of curiosity in gameplay, but I wonder whether the feedback about the linear race courses was really about improving their exploration-fun value rather than a criticism of the challenge-fun we hoped to offer in our gameplay design.[4]
I also wonder if desire to explore the game's affordances, not merely the environment, is expressed in ideas like, "Try to land on that chimney!"
Finally, the experience gave me a sense of affirmation that there is intrinsic value in game dev; we make products that delight and inspire. It called me to a higher standard of quality in the design work that I do by putting me in touch with my audience more closely. I'd like to do more in future (actually I enjoy the chances I get to watch friends, strangers and relatives play games when I can!) and champion making it more commonly part of the game development process as it seems to be in other design fields.
Usability testing has been, and is, part of the process for making great games, including Portal and other titles, yet it is not widespread in game dev. The same reasons why it is good practice to conduct QA testing early and frequently apply to usability testing, and there are perhaps some issues that can only come to light when real users are put in front of the build, not from regular testing with seasoned full-time staff who have seen the game over and over. It doesn't need to break the bank, and it is worth preparing for intelligently to get the most out of it. Arguably, it also makes the process of game design more connected to the audience - where it needs to be!
Have you tried or will you try out in-house usability testing? What practical issues with conducting in-house usability testing as described should also be considered? What effects do different tweaks have on whether something is seen as "easy" or "hard", "boring" or "confusing"?
For educators - I ran about three informal usability sessions with 16-19-year-old Media Production / Game Dev students and they were lively, interactive, and vocationally-relevant classes. Students can act as facilitators and take turns to ask questions of a guest usability tester and the class can discuss afterwards what surprised them and what they would change in the game to overcome problems that were highlighted by the session.
Krug, S. (2005). Don't Make Me Think! A Common Sense Approach to Web Usability. New Riders.
Hussein, S. (2005). Developing E-Learning Materials: Applying User-Centred Design. NIACE.
Lazarro, N. (2004). Four Keys to Fun: http://xeodesign.com/4k2f/4k2f.jpg
Lemoore, H. (2007). "Rocknor's Bad Day" playtest insights: http://hanfordlemoore.com/v/dont-do-what-your-users-say
Marmura, R. (2008). Usability testing with paper prototypes, Game Career Guide: http://gamecareerguide.com/features/622/paperprototyping5factsfor_.php?page=1
Valve's Chet Faliszek: Playtesters Aren't Idiots, It's You (2008): http://www.gamasutra.com/php-bin/news_index.php?story=19523
WIRED feature on Halo 3 testing: http://www.wired.com/gaming/virtualworlds/magazine/15-09/ff_halo
1. Based on viewing Kung Fu Chaos usability lab footage from Microsoft.
2. See "Valve's Chet Faliszek: Playtesters Aren't Idiots, It's You".
3. Not entirely incidentally, these are key issues for teachers, too, but those similarities are for a possible future blog entry.
4. This is bearing in mind Nicole Lazarro's work, specifically discriminating between easy fun and hard fun.
[This piece was reprinted from #AltDevBlogADay, a shared blog initiative started by @mike_acton devoted to giving game developers of all disciplines a place to motivate each other to write regularly about their personal game development passions.]