  Intro to User Analytics
by Anders Drachen, Alessandro Canossa, Magy Seif El-Nasr [Business/Marketing, Design, Game Developer Magazine, Console/PC, Social/Online, Smartphone/Tablet, GD Mag, GD Mag Exclusive]
 
 
May 30, 2013 (Page 3 of 6)
 

Integrating Analytics

Bias is introduced into a dataset both by the selection of the features to be monitored and by the measurement strategies adopted, and this happens to a large degree when analysts work in a vacuum. If those responsible for analytics cannot communicate with all relevant stakeholders, critical information will invariably be missing and the full value of analytics will not be realized.

Analytics groups are placed differently across companies because analytics arrived in the industry from different directions, notably user research, marketing, and monetization, and this can lead to a situation where the analytics team only services or prioritizes its parent department. Strong lateral integration, making sure that the analytics team communicates with all the other teams, helps avoid this issue. It also alleviates a common problem: analytics teams without sufficient access to the design teams are forced to self-select the features to track and analyze, without proper grounding in the design of the game and its monetization model.



Even for a small developer with a part-time analyst, this can be a problem. Another typical problem is that decisions about which behaviors to track are made without involving the analytics team. This can lead to a lot of extra time spent later trying to work with data that are not quite what is needed, or to recording additional datasets. Good communication between teams also helps alleviate friction between analytics and design.

Importantly, analytics should be integrated from the onset of a production, all the way back in the early design phases. It should be planned early which kinds of behavior to track and at what frequencies; a brief sketch of such a tracking plan follows this paragraph. This allows for optimal planning of how to ensure that analytics delivers value to design, monetization, marketing, and so on. Analytics should never be slapped on sometime after the beta. In this respect analytics is similar to other tools like user research: ideally, it is embedded throughout the development process and after launch.
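As a concrete illustration, here is a minimal Python sketch of what such an up-front tracking plan might look like. The event names, attributes, and frequencies are hypothetical examples, not prescriptions from this article; the point is that designers, analysts, and engineers can review and sign off on the plan before any instrumentation is written.

# Hypothetical tracking plan drafted during early design.
# Event names, attribute lists, and frequencies are illustrative only.
TRACKING_PLAN = {
    "session_start": {"attributes": ["user_id", "client_version", "entry_point"],
                      "frequency": "every occurrence"},
    "session_end":   {"attributes": ["user_id", "session_length_s"],
                      "frequency": "every occurrence"},
    "player_death":  {"attributes": ["user_id", "level_id", "cause"],
                      "frequency": "every occurrence"},
    "position_ping": {"attributes": ["user_id", "x", "y", "z"],
                      "frequency": "every 5 s, sampled for 10% of users"},
}

def review_plan(plan):
    # Print the plan so every stakeholder can audit what will be tracked.
    for event, spec in plan.items():
        print(f"{event}: {', '.join(spec['attributes'])} ({spec['frequency']})")

review_plan(TRACKING_PLAN)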

Feature Selection

Knowing that there is an array of things we can measure about user behavior, how do we then select among them? And do we really have to make choices here? Sadly, yes. In real life, we rarely have the resources to track and analyze all possible user behaviors, which means we have to develop an approach to analytics that weighs the resources required for tracking, storing, and analyzing user telemetry/metrics against the value of the insights obtained. It is also important to be aware that the analyses needed during different stages of production and post-launch vary. For example, during the later phases of development, tuning the design is vital, but many metrics related to monetization cannot be calculated because the target audience has not yet made any purchases.

We will discuss this in more detail below, but in short, following this line of reasoning, the minimum set of user attributes to be tracked, stored, and analyzed should include the following (a brief event sketch follows the list):

1) General attributes: The attributes that are shared by users (as customers and players) across all games, for example the time at which a user starts or stops playing, a user ID, user IP, entry point, and so on. These can always be collected, for any computer game, and form the core of any game analytics dataset.

2) Core mechanics/design attributes: The essential attributes related to the core of the gameplay and mechanics of the game. (For example, attributes related to time spent playing, virtual currency spent, number of opponents killed, and so on.) Defining the core design attributes should be based directly on the key gameplay mechanics of the game, and should provide information that lets designers make inferences about the user experience (whether players are progressing as planned, if flow is sustained, death ratios, level completions, point scores). 
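To make these two categories concrete, here is a minimal Python sketch of a telemetry event that carries the general attributes alongside game-specific core design attributes. The field names and the make_event helper are hypothetical illustrations, not part of any particular analytics API.

import time
import uuid

def make_event(user_id, event_type, design_attributes):
    # 1) General attributes: shared across all games.
    event = {
        "event_id": str(uuid.uuid4()),
        "user_id": user_id,
        "timestamp": time.time(),
        "event_type": event_type,
    }
    # 2) Core design attributes: tied to this game's mechanics.
    event.update(design_attributes)
    return event

# Example: a level-completion event for a hypothetical action game.
event = make_event(
    user_id="u-1234",
    event_type="level_complete",
    design_attributes={
        "level_id": "forest_03",
        "time_spent_s": 312.5,
        "opponents_killed": 17,
        "currency_spent": 40,
        "deaths": 2,
    },
)
print(event)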

 
 
Comments

Taylor Stallman
Great article. Having just graduated from college last December with a focus on database marketing, I found this article spot on. It even taught me a number of things! Thanks a lot for the article; I'll be sure to use this the next time I need to analyze data.

Henrik Strandberg
Excellent article! I'd also be very curious to learn about your approach to and experience with QAing the data sets; after all, if you haven't verified (via QA) that the data is correctly generated, aggregated, transformed, and exposed, how can you trust the analysis?

In my experience, that's one of the main constraints in gameplay-related feature selection; QAing thousands of data points is simply unrealistic.

Lukasz Twardowski
Henrik, that's a really good question. Most initial analytics instrumentations generate false numbers. One, because the people who instrument analytics are not the same people who use it; they rarely understand what data to collect, why, or how to check its integrity. Two, because games are complex and usually have plenty of small glitches or shortcuts that might not impact the player experience at all but can totally corrupt your data sets.

Developers who are aware that their analytics may produce false numbers trust the data only when the results are in line with their assumptions, which makes the analytics rather pointless. Those who are not aware of it and trust their data usually end up making expensive mistakes. In both cases, it would honestly be better to go without data and trust your team's experience.

Of course, there are ways to deal with this problem and get reliable results from analytics:

1. Make sure the engineer instrumenting the analytics service works directly with someone experienced with that specific service, either someone in-house or a support person from the analytics vendor who can explain the process and audit the integration.

2. Having real-time analytics during integration really helps, as you can record your session and check the results instantly. It's also important to be able to clean up the database (or filter out your most recent activity) so that you are checking data from the last session only. If your analytics doesn't give you that comfort, log every outgoing data point on your side and do the math yourself; a small sketch of that approach follows.
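For example, a minimal Python sketch of client-side event logging, assuming the vendor SDK exposes some send call (sdk_send below is a hypothetical stand-in), so counts can be recomputed independently of the vendor's dashboard:

import json
import time

LOG_PATH = "outgoing_events.jsonl"

def send_and_log(sdk_send, event):
    # Mirror every outgoing data point to a local log before sending.
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps({"sent_at": time.time(), "event": event}) + "\n")
    sdk_send(event)

def recount(event_type, log_path=LOG_PATH):
    # Recompute a simple count locally to compare against the dashboard.
    with open(log_path) as f:
        events = [json.loads(line)["event"] for line in f]
    return sum(1 for e in events if e.get("event_type") == event_type)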

3. Even if you are very diligent about the integration, the chances that you will get it right from the beginning are low. If you don't want to get into trouble due to data misinterpretation, double-check it using a qualitative approach. Some analytics services allow you to export data points by session ID to Excel, and some others* have a full set of features for analyzing individual sessions or users. This will help you identify mistakes in data collection and also better understand correctly collected data before you jump to conclusions. (A session-grouping sketch follows the footnote below.)

*UseItBetter Analytics (disclosure: I'm a co-founder) is the only one I know of that does that for games, but I might be totally wrong about it. Maybe Mixpanel or Playnomics?
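For the do-it-yourself route, here is a minimal Python sketch of that qualitative check: group exported data points by session ID and walk through one session's timeline. The field names (session_id, timestamp, event_type) are hypothetical:

from collections import defaultdict

def sessions_from(events):
    # Group flat data points into per-session timelines.
    by_session = defaultdict(list)
    for e in events:
        by_session[e["session_id"]].append(e)
    return by_session

def print_session(events, session_id):
    # Walk one session in order to sanity-check the collected data.
    for e in sorted(sessions_from(events)[session_id], key=lambda e: e["timestamp"]):
        print(e["timestamp"], e["event_type"])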

