Hot Failure: Tuning Gameplay With Simple Player Metrics

December 16, 2010

[In this article taken from Game Developer magazine's September 2010 issue, Google game developer advocate Chris Pruett describes how he quickly and cheaply implemented useful metrics into his Android game, Replica Island.]

There's nothing like watching somebody else play your game. Over the course of development, you've played the game daily, and have, perhaps unconsciously, developed a particular play style. But putting your work into the hands of a novice gives you a chance to see what happens to your design when it's played without the benefit of daily practice.

Every collision pop, animation snap, confusing tutorial message, and intermittent bug seems amplified when a beginner plays. No matter how much you polish or how many bugs you fix, your play style and intimate familiarity with the content can bias you away from problems that other users will immediately encounter.

That is why playtesting is a vital part of making a good game. But to get the most out of playtesting, you need to capture some data from those sessions -- this article chronicles my experience with gathering gameplay metrics.

Starting Simple

I got my start in the industry writing Game Boy Advance games. Back then, our idea of playtesting was pretty straightforward: we would get some local kids to come in, hand them a special GBA that was hooked up to a VCR, let them play for a bit, and then go back and review the tapes. This procedure surfaced immediate, dramatic bugs.

Areas that the team took for granted were often sources of tremendous frustration for our testers. When a member of the target audience fails continuously in a specific area, it is usually a clear message that something needs to be fixed. After a couple of iterations with real live kids, the side-scrollers we were making were vastly improved.

Nowadays, I work on and advocate for games on Android phones. My first Android game, Replica Island, is a side-scroller, not so different from the GBA games I was making 10 years ago. But some things have changed: I'm no longer working for a game studio; I wrote Replica Island on my own, with the help of a single artist, mostly in my free time.

I also no longer have access to a pool of young playtesters, and even if I did, my target audience is a bit older. Finally, there's no easy way to record the output of a phone while somebody is playing -- the only way to really see what's going on is to stand over their shoulder, which is awkward and can influence the way the tester plays.

What is an indie phone game developer to do? As I reached feature completeness for Replica Island, I realized that I really had no way to guarantee that it was any fun. The game had been developed in a vacuum, and I needed to get more eyes on it before I could feel confident releasing it.

The first thing I tried was user surveys. I put the game up on an internal page at work and sent out an email asking folks to play it and give me feedback. I even set up a feedback forum with a few questions about the game.

This approach was pretty much a complete failure; though many people downloaded the game, very few (fewer than 1 percent) bothered to fill out my five-question survey. Those who did fill out the survey often didn't provide enough information; it's pretty hard to tell whether "game is too hard" indicates a failure in the player controls, the level design, the puzzle design, the tutorial levels, or what.

Thinking About Metrics

After that setback, I remembered reading about the player metrics system Naughty Dog developed for the original Crash Bandicoot. The system wrote statistics about play to the memory card, which could then be aggregated offline to find areas that took too long or had a high number of player deaths.

These problematic areas were reworked, and the data was also used to tune the dynamic difficulty adjustment system in that game. One of the most interesting principles that fed into the design of this system was Naughty Dog's idea that the game over screen must be avoided at all costs. Their end goal was to remove "shelf moments," moments in which the player got stuck and could not continue.

I thought this was a pretty cool idea, but I wasn't sure how feasible it would be on a phone. I asked around a bit to see what the current state of metrics recording was in big-budget games, and found that many companies have some way to report statistics about player actions. Several people told me that while they collect a lot of information, they have trouble parsing that data into results that suggest specific design changes.

On the other hand, some studios have tools that can recreate a player's path through a level, and produce statistics about which weapons users prefer, which enemies are particularly tough, and which parts of the map are particularly visible. It seems that collecting player metrics is applicable to a wide variety of games, but that it only benefits studios that also invest significant time in building tools to crunch all the data they collect.

(For an example of how this kind of system can be taken to the extreme, see Georg Zoeller's talk about the crazy system they have at BioWare.) It turns out that collecting the data is the easy part -- rendering it in a way that is useful for designers is much harder.

That sounded discouraging, as my goal was to keep my tool chain as simple as possible. But I decided to experiment with some metrics recording anyway, starting with just a few key metrics. My Android phone didn't have a memory card, but it did have a persistent internet connection. Maybe, I thought, I could log a few important events, send them to a server, and get results from players that way. My goal was to try to understand as much as possible about my players while keeping the system as simple as possible.

