It's 2015. Every game developer has heard how important analytics and big data are. So everyone is rushing in: VCs are funding new vendors such as Amplitude and Omniata, and you are rushing to try out Amazon Redshift. This blog post is about what to watch out for in analytics projects. The points below are my personal opinions, based on 6 years at Sonamine deploying advanced predictive scoring, combined with 8 years at MicroStrategy, a BI vendor whose tools are used by the likes of EA and Activision.
Expecting too much from analytics (aka drinking the Kool-Aid)
There is a lot of marketing content about analytics out there, most of it put out by vendors and industry analysts. Many case studies show how conversion rates can be increased or CPIs reduced. But in aggregate, how much can analytics help your bottom line? My experience is that, in general and under best-case scenarios, analytics can improve revenues or costs by about 25%. Yes, there will always be outliers, but that is the overall baseline.
About that 3x higher conversion rate study: read the fine print and see if it mentions the size of the campaign. Most likely it compares a campaign with a small target audience (e.g. 10% of users) against the overall conversion rate of the entire user base. A 3x higher conversion rate for 10% of users only gets you a 20% improvement overall. (See note 1 for the calculation.)
The right-sized expectation matters in how you budget for analytics initiatives. As always, it is good advice to under-promise and over-deliver.
Ignoring the usage aspect (aka build it and they will come)
Analytics and big data refer to many things, but in general you have to collect the data, consolidate it and generate insights from it. But insights are useless unless they are actually used to change something. Let's say the purchase funnel report clearly identifies a step that is broken. If there is no interest or capacity to make changes to the purchase funnel, then the insight is useless.
There are innumerable predictive models and PowerPoint decks of charts sitting idle because they will never be used. One main reason is that the teams that have to act on the insights are usually not the ones initiating the analytics projects. Many times I have seen the "central analyst team" discover an incredible insight, only to find that the rest of the organization has other goals and priorities, and the analysts have difficulty selling the idea. The game design team is not asking for that awesome multi-dimensional player clustering because they already have 3 specific player profiles they are designing for. The CRM team is not asking for predictive scores of likely converters because they can simply hit every player with spam messages. Management is not interested in cutting CPI by 10%; they would rather invest in a new game to double revenues.
Analytics is inherently a multi-department effort if you want to reap the rewards. Involve the "other" department early (read: before you hire people or contract with vendors) and be willing to change your analytics projects to match their priorities.
Being distracted by analytics (aka overdoing it)
Analytics gives you insights to tweak and optimize something that is already working. For a game developer or publisher, this means a game with decent player traction; established marketing processes to acquire, retain and monetize players; customer services and community management to keep players happy, enough finances to keep going. Just getting a game released is a top priority, not to mention bug fixes and content updates. Management time is a precious non-renewable resource so it should be prioritized.
Analytics projects, unfortunately, are like the Hydra, sprouting new heads just when you think you are done. This is partly due to the rapid introduction of new tools and vendors. Many Sonamine customers who were using Vertica just 2 years ago are now migrating to Amazon Redshift. There is immense temptation to adopt new tools, which are usually "free" to try and promise better insights, in "real time"! Alas, your management attention is not free, and new tools add risk to your analytics projects.
It's OK to live with basic performance metrics and get smarter once the mission-critical operations are in place.
Tricked by ROI (aka focusing on the wrong priorities)
Return on investment (ROI) measures whether an analytics project returns more than it costs. Return is usually measured as additional revenue or lowered costs; analytics costs usually include software, hardware and labor. A positive ROI means the project generates more value than its total cost. This is taken as a good sign, and a 10x ROI is considered amazing.
However, there are two problems to keep in mind with ROI. The first is that management has to make sure the added value is meaningful: a highly positive ROI project that yields only a 0.5% improvement in overall revenue or costs will do little for the company. The second is that ROI calculations like these do not account for opportunity cost. Was there something else of greater value that should have occupied your time instead of an A-B testing tool?
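To make the first problem concrete, here is a toy comparison (all figures are hypothetical, chosen only to illustrate the point): a project can post a spectacular ROI multiple while remaining immaterial to the business.

```python
# Toy ROI comparison; all figures below are hypothetical.
# A high-ROI project can still be immaterial to the business.

def roi(added_value, cost):
    """Return on investment expressed as a multiple of cost."""
    return added_value / cost

annual_revenue = 10_000_000  # studio-wide revenue (hypothetical)

# Project A: cheap A-B testing tool, great ROI, tiny absolute impact
a_cost, a_value = 5_000, 50_000
# Project B: modest ROI, but material to the business
b_cost, b_value = 400_000, 1_200_000

print(f"A: ROI {roi(a_value, a_cost):.0f}x, "
      f"{a_value / annual_revenue:.1%} of revenue")   # 10x, but 0.5%
print(f"B: ROI {roi(b_value, b_cost):.0f}x, "
      f"{b_value / annual_revenue:.1%} of revenue")   # 3x, but 12.0%
```

Project A is the 10x ROI story that makes a great vendor case study; project B is the one that moves the company.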
Be mindful that analytics is not taking the place of something else more important.
Before you start an analytics project
Some simple things you can do to increase the odds of success:
- Set right-sized expectations and budget accordingly; plan for roughly 25% improvements, not miracles.
- Involve the departments that will act on the insights before you hire people or contract with vendors, and be willing to adjust the project to their priorities.
- Hold off on new tools until your mission-critical operations are in place.
- Look beyond the ROI multiple: check the absolute impact and the opportunity cost of each project.
Note 1: Assume a 1% conversion rate. From 1,000 users, you get 10 conversions. With a targeted campaign to 10% of users at a 3x conversion rate, you get 3 conversions from those 100 users and 9 from the rest of the user base, i.e. 12 in total, a 20% improvement.
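The arithmetic in note 1 can be sketched as a small function (using the note's numbers; the function name is mine):

```python
def blended_conversions(users, base_rate, target_frac, lift):
    """Total conversions when a fraction of users sees a lifted rate."""
    targeted = users * target_frac
    rest = users - targeted
    return targeted * base_rate * lift + rest * base_rate

baseline = 1000 * 0.01                                 # 10 conversions
campaign = blended_conversions(1000, 0.01, 0.10, 3)    # 3 + 9 = 12
print(round(campaign / baseline - 1, 4))               # 0.2, i.e. a 20% overall lift
```

Plug in your own campaign reach and lift to see how quickly a headline "3x" shrinks once it is blended with the untouched majority of the user base.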
Note 2: Other than Sonamine, which I founded, I have no professional or financial affiliations with the vendors mentioned in this post.