Gamasutra: The Art & Business of Making Games
Hot Failure: Tuning Gameplay With Simple Player Metrics

December 16, 2010 | Page 1 of 4

[In this article taken from Game Developer magazine's September 2010 issue, Google game developer advocate Chris Pruett describes how he quickly and cheaply implemented useful metrics into his Android game, Replica Island.]

There's nothing like watching somebody else play your game. Over the course of development, you've played the game daily, and have, perhaps unconsciously, developed a particular play style. But putting your work into the hands of a novice gives you a chance to see what happens to your design when it's played without the benefit of daily practice.

Every collision pop, animation snap, confusing tutorial message, and intermittent bug seems amplified when a beginner plays. No matter how much you polish or how many bugs you fix, your play style and intimate familiarity with the content can bias you away from problems that other users will immediately encounter.

That is why playtesting is a vital part of making a good game. But to truly get the most out of playtesting, you need to gather data from those sessions -- this article chronicles my experience with gathering gameplay metrics.

Starting Simple

I got my start in the industry writing Game Boy Advance games. Back then, our idea of playtesting was pretty straightforward: we would get some local kids to come in, hand them a special GBA that was hooked up to a VCR, let them play for a bit, and then go back and review the tapes. This procedure surfaced immediate, dramatic bugs.

Areas that the team took for granted were often sources of tremendous frustration for our testers. When a member of the target audience fails continuously in a specific area, it is usually a clear message that something needs to be fixed. After a couple of iterations with real live kids, the side-scrollers we were making were vastly improved.

Nowadays, I work on and advocate games for Android phones. My first Android game, Replica Island, is a side-scroller, not so different from the GBA games I was making 10 years ago. But some things have changed: I'm no longer working for a game studio; I wrote Replica Island on my own, with the help of a single artist, mostly in my free time.

I also no longer have access to a pool of young playtesters, and even if I did, my target audience is a bit older. Finally, there's no easy way to record the output of a phone while somebody is playing -- the only way to really see what's going on is to stand over their shoulder, which is awkward and can influence the way the tester plays.

What is an indie phone game developer to do? As I reached feature completeness for Replica Island, I realized that I really had no way to guarantee that it was any fun. The game had been developed in a vacuum, and I needed to get more eyes on it before I could feel confident releasing it.

The first thing I tried was user surveys. I put the game up on an internal page at work and sent out an email asking folks to play it and give me feedback. I even set up a feedback forum with a few questions about the game.

This approach was pretty much a complete failure; though many people downloaded the game, very few (fewer than 1 percent) bothered to fill out my five-question survey. Those who did fill out the survey often didn't provide enough information; it's pretty hard to tell whether "game is too hard" indicates a failure in the player controls, or the level design, or the puzzle design, or the tutorial levels, or something else entirely.

Thinking About Metrics

After that setback, I remembered reading about the player metrics system Naughty Dog developed for the original Crash Bandicoot. The system wrote statistics about play to the memory card, which could then be aggregated offline to find areas that took too long or had a high number of player deaths.

These problematic areas were reworked, and the data was also used to tune the dynamic difficulty adjustment system in that game. One of the most interesting principles that fed into the design of this system was Naughty Dog's idea that the game over screen must be avoided at all costs. Their end goal was to remove "shelf moments," moments in which the player got stuck and could not continue.

I thought this was a pretty cool idea, but I wasn't sure how feasible it would be on a phone. I asked around a bit to learn the current state of metrics recording in big-budget games, and found that many companies have some way to report statistics about player actions. Several people told me that while they collect a lot of information, they have trouble parsing that data into results that suggest specific design changes.

On the other hand, some studios have tools that can recreate a player's path through a level, and produce statistics about which weapons users prefer, which enemies are particularly tough, and which parts of the map are particularly visible. It seems that collection of player metrics is applicable to a wide variety of games, but that it only benefits the studios who also take significant time to build tools to crunch all the data that they collect.

(For an example of how this kind of system can be taken to the extreme, see Georg Zoeller's talk about the crazy system they have at BioWare.) It turns out that collecting the data is the easy part -- rendering it in a way that is useful for designers is much harder.
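To make the data-crunching side concrete, one of the simplest useful tools is a heat map of player deaths: bin each recorded death position into a coarse grid, and the hottest cells point straight at problem areas in a level. The sketch below is hypothetical (it is not any studio's actual tooling, and the class and method names are my own invention), but it shows how little code the basic aggregation step requires.

```java
// Hypothetical sketch: bin recorded death positions into a coarse
// grid so that high-count cells reveal problem areas in a level.
public class DeathHeatmap {
    private final int[][] counts;   // counts[row][col] = deaths in that cell
    private final float cellSize;   // width/height of one grid cell, in world units

    public DeathHeatmap(int cols, int rows, float cellSize) {
        this.counts = new int[rows][cols];
        this.cellSize = cellSize;
    }

    // Record one death at world coordinates (x, y).
    public void record(float x, float y) {
        int col = (int) (x / cellSize);
        int row = (int) (y / cellSize);
        if (row >= 0 && row < counts.length && col >= 0 && col < counts[0].length) {
            counts[row][col]++;
        }
    }

    // Read back the death count for a single grid cell.
    public int countAt(int col, int row) {
        return counts[row][col];
    }
}
```

A visualization pass would then color each cell by its count and overlay the grid on the level map; the aggregation itself is trivial, which is consistent with the observation above that collecting the data is the easy part.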

That sounded discouraging, as my goal was to keep my tool chain as simple as possible. But I decided to experiment with some metrics recording anyway, starting with just a few key metrics. My Android phone didn't have a memory card, but it did have a persistent internet connection. Maybe, I thought, I could log a few important events, send them to a server, and get results from players that way. My goal was to learn as much as possible about my players while keeping the system itself dead simple.
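A minimal sketch of what "log an event and send it to a server" can look like, assuming a plain HTTP endpoint that accepts events as query parameters. Everything here is hypothetical -- the URL, class, and field names are my own, not the actual Replica Island code -- but it illustrates how small the client side of such a system can be.

```java
import java.util.Locale;

// Hypothetical sketch: reduce one gameplay event (e.g. a player
// death) to a single logging URL that a background thread can fetch.
public class EventLogger {
    private final String baseUrl;

    public EventLogger(String baseUrl) {
        this.baseUrl = baseUrl;
    }

    // Build the request URL for one event at world coordinates (x, y)
    // on a given level, tagged with a per-install session id.
    public String buildEventUrl(String eventType, String level,
                                float x, float y, long sessionId) {
        // Locale.US keeps the decimal separator a '.' regardless of
        // the phone's locale, so the server can parse it reliably.
        return String.format(Locale.US,
                "%s?event=%s&level=%s&x=%.1f&y=%.1f&session=%d",
                baseUrl, eventType, level, x, y, sessionId);
    }

    public static void main(String[] args) {
        EventLogger logger = new EventLogger("http://example.com/log");
        System.out.println(
                logger.buildEventUrl("death", "level_3", 512.0f, 128.0f, 42L));
    }
}
```

On Android the actual request would be issued off the main thread (e.g. via `HttpURLConnection` on a worker thread), and events would typically be queued and sent in batches to save battery and survive dropped connections.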



Carl Chavez
Good article! Thanks a lot.

Simon T
Excellent article, very informative.

Just wondering, how many testers were in your test group?

Robert Boyd
Really interesting article.

Chris's blog on horror games & design is one of my favorite sites as well. Very informative.

Paopao Saul
A very informative read! I love articles that go in-depth with their technical details. Thanks Chris P.

Maurício Gomes
Whoa, cool article!

I have the same problem in my game, the testers play but never report back...

I only wonder what data I need to collect :/

Mark Venturelli
Very good insight on low-budget data gathering, but as your example proves, *nothing* replaces good old direct observation. If you just analyze data without actually seeing how people are playing the game, you are bound to make some misguided design decisions.

And when I say "people playing the game" it's not only the gameplay footage, but the actual person, their facial expressions, button presses, and so on. Just gather some friends who never played it before and you're golden (as long as you keep your mouth shut during the session).

Eric Schwarz
Excellent and detailed article. Using heat maps sounds a lot like what Epic did when developing Gears of War multiplayer levels - but rather than try to reduce deaths, they used them to make sure players were using the majority of the space and taking advantage of certain weapons and tactical options effectively.

I remember when I was designing a single-player mod campaign for Crysis, I got my friend to have a look at it. There was one specific place where the player was given a sniper rifle overlooking a village, with the intention that he or she use it to take out the guards below. The problem, of course, was that neither the cliff overlook nor the sniper rifle was very visible, so I spent a huge amount of time just toying with the vegetation placement, lighting, and the approach to make the overlook that much more apparent, and more visible than the side path down. It's the kind of thing I simply wouldn't have known to fix without just one extra pair of eyes on it.

Manuel Mestre-Valdes
I did play "Replica Island" on my phone a few months ago. I was oblivious that all this data gathering was taking place, but I think it's a brilliant idea to improve the game.

I liked the level design, the clever use of the HTC Hero trackball, the open source origins and the overall graphical and design quality of the game.

The problem for me is that I don't think it's a genre suited for a smartphone. The use of trackball was clever, but it can't compare to the feel of buttons.

Jordan Lynn
Back end data is extremely useful and inexpensive (when implemented like you did it), but it can only tell you who did what, when, and where.

Direct observation is more time and resource heavy, but is the only way to answer any question that starts with "Why...?"

The best approach? Do both, using the former to identify problem areas, then observing new players playing those sections to get detailed player information.