It is no secret that there has been a shift in monetization efforts for video games as the end of this console generation draws near. Where a game used to cost $60, and that was it, we have witnessed an ever-increasing push toward delivery models such as 'Free to Play' (where the game is free to get into, but the player is under pressure to pay for content as they progress), 'Episodic' (where a game is released as a series of smaller chunks that the player pays for individually), 'Downloadable Content, or DLC' (where extra content, such as extended story, additional maps, or new modes, is sold to the player several weeks after the game ships), and 'Digital Rights Management, or DRM' (where the player is required to enter a unique online passcode before the game can be played). There have even been efforts to minimize or completely do away with the player's ability to sell and purchase used copies of their games, ultimately culminating in the 'Xbox One-Eighty' debacle, where Microsoft attempted to block used games on their next-gen console and was eventually forced to change its policies due to consumer protest.
Meanwhile, seemingly in parallel, there has been a perceptible rise in negative events within the industry itself. Stories about cancelled sequels to beloved franchises, and even full-scale studio closures, seem more and more commonplace in recent months.
Naturally, all of these efforts to squeeze more and more hard-earned dollars out of the gaming public, along with the loss of fan favorites, have not exactly gone over well (just ask Microsoft's Don Mattrick), but the industry's response has always been the same: these monetization tactics are necessary, and in fact completely unavoidable, because the cost of making modern video games has been steadily increasing, and they are the only way to ensure that gamers continue to receive the quality of games they have become accustomed to.
Then, not a few days ago, an article surfaced on examiner.com by one (since let go) Alex Hinkley, titled 'The Problem With The Gaming Industry Is That Developers Make Too Much', and though the piece consists purely of distorted facts, sweeping generalizations, and defamatory conjecture, Mr. Hinkley's hypothesis (namely, that the developers' own inflated salaries are the cause of the rise in so-called 'anti-consumer' monetization tactics, sequel cancellations, and studio closures) is, if believed, potentially very damaging to the already strained relationship between gamers and the gaming industry.
It is because of my intense desire to protect, and hopefully even mend, this relationship that I feel I have an obligation to respond to these allegations today.
So, let's examine Hinkley's claim then, shall we? 'The problem with the gaming industry (i.e. recent monetization tactics, studio closures, etc.) is that game developers make too much.' For this to be true, he would need to prove at least two things: first, that developers do, in fact, make 'too much' (whatever that means), and second, that these purportedly inflated salaries are the direct cause of the recent changes in monetization efforts.
Why don't we begin by digging into whether game devs actually do make 'too much', and then we can look into whether there is any correlation between that and the negative trends occurring within the industry.
Now let's pretend for a moment that we, Mr. Hinkley, or anyone other than the employer and the employee for that matter, have the right to decide what that employee ought to be paid for their job (which we clearly do not). How would we arrive at this decision? The conventional wisdom would be to apply a series of criteria (education, experience, rarity of the skill set, etc.) to the candidate in order to assess their relative value, while also taking into consideration more global factors such as the standard market value for the role, the cost of living in the area, and the organization's ability to pay said salary. Or, if we wanted to get really 'technical', we could always go with Mr. Hinkley's method of simply comparing the average salary across all disciplines and levels of experience for the candidate's entire industry against the overall average salaries of various completely unrelated industries.
Now, of course I'm being sarcastic here, because that would be silly, but since Mr. Hinkley decided to take this approach, we might as well see if we can use it too.
Before we can compare average salaries, however, we must first examine the concept of an 'average', or 'arithmetic mean'. Wikipedia describes it as 'the sum of a collection of numbers divided by the number of numbers in the collection', or, in English, you add up all of the different salaries and then divide the total by the number of salaries you added up. The description goes on to note, however, that the average can often be an inaccurate representation of the 'middle' value of a group, most notably for 'skewed distributions, such as the distribution of income for which a few people's incomes are substantially greater than most people's'. Simply put, the use of an 'average salary' becomes misleading when one or more members of the group (say, executives or those with equity in the company) earn significantly higher pay than the rest. The handful of higher salaries pulls the average way up, until it is much higher than what the majority of the other salaries actually are.
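To make the skew concrete, here is a quick sketch of the effect (the salary figures are invented purely for demonstration, not taken from any survey):

```python
from statistics import mean, median

# Nine hypothetical rank-and-file developer salaries...
salaries = [49_000, 52_000, 58_000, 61_000, 66_000, 70_000, 74_000, 80_000, 85_000]
# ...plus one executive with equity.
salaries_with_exec = salaries + [450_000]

print(f"Mean without the executive: ${mean(salaries):,.0f}")
print(f"Mean with the executive:    ${mean(salaries_with_exec):,.0f}")
print(f"Median with the executive:  ${median(salaries_with_exec):,.0f}")
# The single outlier drags the mean far above what most people in the
# group actually earn, while the median barely moves.
```

One high earner is enough to push the 'average' above every single ordinary salary in the group, which is exactly why the median is the more honest summary for income.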
With this in mind, let's take another look at Hinkley's numbers. In his article, Hinkley compares the average game developer salary of $81,000 per year with the average annual salary for a police officer of just $51,000 (as well as with those of a teacher and a CIA agent). He then goes on to ask, 'Why are developers making so much?'
As it is presented, however, this juxtaposition is extremely misleading. The reason for this goes back to the problem with using averages for salaries, and how they can be dramatically skewed by high earners in a group.
For demonstration's sake, let's use Hinkley's police officer example. First of all, the police force is public-sector employment. Officers' salaries are paid by our taxes, they have no CEOs or CFOs, and their salaries, whether rookie or Chief, are all within the same ballpark, more or less. Game development studios and publishers, on the other hand, are private sector. They do have executive management and developer-founders with equity, who earn significantly higher salaries than the rest of the developers in their organization and therefore distort the average salary to a much higher number than what most, if not all, of the other employees make.
Secondly, police are a public service, which means they are employed literally everywhere in the entire country. This is significant because not all cities are created equal when it comes to the cost of living, one of the most influential factors in determining salary. Some cities have a higher cost of living, and the police officers employed there obviously get paid higher salaries. But one look at any cost-of-living map will show that the majority of cities in America actually have a comparatively lower cost of living, so the police officers employed there naturally make much lower salaries, thereby skewing the overall average toward the lower end of the pay spectrum. Game developers, on the other hand, are concentrated in the larger, higher-cost-of-living cities. If you look at Gamedevmap.com (a geographical database of game development studios around the world), you will notice that it looks very similar to the cost-of-living map: where the cost of living is highest on one map, there is the highest concentration of developers on the other. We can discuss the reasons behind this in another forum, but suffice it to say that that's the way it is for now.
It would appear then that the only way to get a truly accurate comparison between the salary of a police officer, and that of a game developer, would be to exclude the salaries of executives from the game development average, and to also use a sample location where both game developers and police officers are in equal supply. Since Gamedevmap.com shows that the city of San Francisco has one of the highest concentrations of game developers for the whole country (roughly 175 studios), and it doubtlessly has a very high concentration of police officers too, then it would appear to be a fairly safe test bed for the sake of our comparisons going forward.
So what does a police officer in San Francisco make? Well, according to the official SFPD website, the entry-level salary for a police officer ranges from $88,842 to a whopping $112,164 per year. That's right: the entry-level salary for a police officer in San Francisco is higher than the overall average salary for game developers!
But there's more. Remember Hinkley's number of $81,000? That was the average across all experience levels and disciplines, AND it included executives. So in order for our comparison to be truly accurate, we need to look at the entry-level salary for game developers too. According to the Game Developer Salary Survey of 2012 (the very same survey from which Hinkley's source, Game Developer Magazine, took its numbers), the average entry-level salary for a Programmer was $66,116, while new Artists made only $49,481.
So to summarize: most game developers actually make about half of what a police officer with the same experience, in the same geographical location, would make. That's essentially a complete reversal of the $81,000 (game dev) to $51,000 (police officer) comparison that Hinkley's 'math' produced.
Now, comparing game developers' salaries to other random professions is all well and good, but before we make a final decision on whether or not developers really do 'make too much', maybe we ought to take a quick look at some of the criteria that actual employers consider when making the same decision. According to salary.com, the top two factors that influence what someone will be paid, apart from the employer's ability to pay and the location, are the candidate's skills/education and experience.
Now, no one can argue that being a police officer does not require a significant level of skill (in fact, a candidate must pass a rigorous physical evaluation AND a written test before they are even admitted onto the force), and it is equally certain that the years a good police officer spends on the force will only increase their earning potential. But we must, once again, look at the numbers in order to have a truly accurate picture of how things stack up.
According to policeone.com (an online guide on how to become a police officer), you must have a high school education and one year in a police academy before you can become a police officer. They do recommend additional education, like an Associate's or Bachelor's degree, but it is not a prerequisite. So as long as you have your high school diploma and pass the exams, you can be a police officer. Furthermore, police officers are needed everywhere, which means you are almost guaranteed to get a job as long as you graduate.
As for becoming a game developer, you technically do not need any formal education at all to get a job in the industry. Game development is a purely skill-based industry, similar to being a professional athlete, actor, or musician: if you apply for a job, your skills and talents will be assessed against those of thousands of other applicants, and you will either be accepted or rejected. That said, having an education in such a highly competitive and technical field does have its advantages: it will teach you the up-to-date skills needed to be successful, give companies more confidence in you during the hiring process, and satisfy the immigration requirements you will face if you want to work in another country (which is often necessary in the game industry). For example, one quick look through my contact list on LinkedIn shows that over 90% of my 342 game industry contacts have a higher education.
It is also important to note that the gaming industry is notoriously difficult to break into. You need passion, talent, and (in most cases) an education to even be considered for a job. And even then, AAA blockbuster video games are massive, technical, and incredibly complex undertakings that require literally hundreds of highly skilled artists, programmers, and designers to work together as a team in order to come off smoothly. For this reason, only the best of the best applicants are chosen, and the sad truth is that most people who try to get a job in the industry are not successful.
So where does that leave us? We know that both jobs are difficult and require a very specific set of skills. We know that neither police officers nor game developers are required to have a higher education to find employment, though the statistics show that those who do have one enjoy a much higher rate of success when applying for a job in game development. And we know that the success rate of finding a job is, as a rule, much higher for police officers than it is for game developers. But what does that tell us at the end of the day? It would seem all we've learned is that both professions are skilled and challenging, and that both deserve to be paid decently for their services. And if anything, police officers are actually getting paid more than game developers anyway.
Either way, based on the statistics that we've seen from the Game Developer Salary Survey of 2012 and elsewhere, I'd say that a salary of somewhere between $49,000 (less than three years of experience) and $135,000 (a Technical Director or similar with 10+ years) is not completely unreasonable, and certainly not high enough for devs to afford exotic sports cars or cause entire studios to be closed down. Wouldn't you agree?
So now that we've discussed at length whether game developers' salaries are really so high, and whether they 'deserve' the salaries they are given, let's take a look at Mr. Hinkley's second claim: that game developers' salaries are 'the problem with the gaming industry'.
In order to prove (or disprove) a correlation between present monetization schemes, studio closures, and developer salaries, we need to go back and look at the industry's explanation for the monetization schemes. In Hinkley's own writing he states that 'nowadays they (publishers) are disappointed when games sell 3.4 million copies in just one month because the costs are so much higher now.' So according to Hinkley, the reason for the changing landscape of the gaming industry is the ever-increasing budgets of the games. Sounds reasonable enough.
It would appear, then, that all we need to do is discern whether it is the ever-increasing salaries of developers that are responsible for the increase in game budgets, and therefore for the change in approach to monetization. Great!
For a timeline, let us use a sample period between 2006 (the year the PlayStation 3 launched, and long before the letters DRM, F2P, and DLC were even whispers in publishers' dreams) and 2011 (the year such schemes really began to gain traction). According to our now very familiar Game Developer Salary Survey, the average salary for an American game developer in 2006 was $75,039. That was, however, the average across all disciplines. If we break it down further, it looks more like this: $37k for Quality Assurance, $65k for Artists and Animators, $80k for Software Engineers, and $95k for Business and Legal.
Now, the same survey for 2011 shows that the average salary for an American game developer in that year was $81,192: effectively the very number Hinkley cited in his own article! Good for him. But once again, if we break it down by discipline, it looks something like this: $48k for Quality Assurance, $76k for Artists and Animators, $93k for Software Engineers, and $102k for Business and Legal.
So that's a raise of roughly $6,000 over the course of five years, or about $1,200 per year for the average game developer. If you do the math using the US Inflation Calculator, that's not even enough to cover the rate of inflation. If the average annual raise for game developers had merely kept pace with inflation, the average salary should have been $86,827 by 2011, not the $81,192 cited by Hinkley.
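You can sanity-check the direction of this yourself by compounding annual US CPI inflation rates over the 2006 average. The rates below are rounded published figures and the result will differ somewhat from the calculator's $86,827 depending on the exact CPI series and endpoints used, but either way the inflation-adjusted figure lands above what developers were actually paid in 2011:

```python
# Approximate annual US CPI inflation rates for 2007-2011, in percent.
# Rounded published figures; treat them as illustrative.
inflation_rates = [2.8, 3.8, -0.4, 1.6, 3.2]

salary_2006 = 75_039
adjusted = salary_2006
for rate in inflation_rates:
    adjusted *= 1 + rate / 100  # compound year over year

actual_2011 = 81_192
print(f"2006 average adjusted for inflation: ${adjusted:,.0f}")
print(f"Actual 2011 average:                 ${actual_2011:,}")
# The adjusted figure comes out a couple of thousand dollars higher
# than what the average developer was actually paid in 2011.
```

In other words, in real terms the average developer was earning slightly less in 2011 than in 2006.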
So, if developers' salaries are not increasing (at least not in any astronomical way), does that mean that game budgets are not increasing either? Well, according to wikia.com, the AAA blockbuster games Gears of War, Assassin's Creed, and Crysis (all from 2006-2007) cost around $10, $20, and $22 million to make, respectively. The very same chart, however, shows that the equally notable blockbusters Crysis 3 and Skyrim (2011-2012) cost around $60 and $85 million to make. That's anywhere from 3 to 8 times as expensive to produce as their 2006 counterparts!
But if the games really are getting more expensive to make, and developers' salaries are not increasing, then where is the extra expense coming from? Well, let's do an experiment, shall we? Go get your copy of Uncharted, Assassin's Creed, or Elder Scrolls: Oblivion, put it in your console, and play it for a bit. OK, now go get your copy of The Last of Us, Assassin's Creed 3, or Skyrim, pop that sucker in your console, and play it for a bit. Can you tell me where the extra expense is coming from?
Two words: Higher. Quality. With every year that passes and with every new release, games are getting bigger, more mechanically complex, and better looking. What's more, they are getting this way because we, the hardcore gaming fans, expect it; nay, demand it. Simply put, the games are becoming more expensive to produce because they are getting better.
So, if you're a publisher, what do you do? On one hand, you've got to deliver on the expectations of the fans in order for them to be happy and actually buy the game, and that means giving projects bigger and bigger budgets. On the other hand, you also have to make the game profitable for your company's shareholders, which means finding more money to make up for those bigger budgets. Well, there are really only three options, aren't there? You can raise the price of the packaged product, which we (the fans) have made pretty clear we will not accept; you can decrease the quality or content of the game from what we have become conditioned to expect, but we've also made it fairly clear (through reviews and comments) that we will not accept that; or you can figure out other ways of getting more money while also providing extra value in the process.
I don't know about you, but whether I'm a publisher, a gamer, or both, I'm fairly confident about which of the three options I'd prefer.
Which, at long last, brings me to my final point, and it's one that hinges entirely on you (and me): the consumer. To put things bluntly, we live in a capitalist society, and at the end of the day, video games are a business, nothing more. So yes, publishers may have arrived at these new approaches to monetization because of the prohibitive nature of modern game development budgets, but they continue to apply them for one reason and one reason alone: because we continue to pay for them.
It is a proven fact that DLC, Free to Play, and micro-transactions work. Mr. Hinkley himself states, 'Using micro-transactions can be a very profitable model. Valve recently announced they have sold over $10 million worth of hats on Team Fortress 2.' So why on earth would a publisher (who is legally bound to increase profits for its shareholders) stop using a method that has proven to be so unbelievably profitable? They would kinda be the worst executives in the world if they did. Wouldn't they?
The bottom line is this: are some game executives, and those rare few developers who have a stake in the company, getting rich enough to drive fancy sports cars? Sure. But those few Cliff Bleszinskis and Bobby Koticks are by all accounts the exception, and certainly not the rule. Furthermore, they are not getting rich because they are robbing the public of their hard-earned money. They are getting rich because the public is telling them (with their wallets) that they are not only OK with it, but actually want more.
Where game developers are concerned, on the other hand, we by and large are the gaming public. We make regular salaries, have families, and enjoy playing the same games as everyone else when we get home after a long day at the office. In fact, most of us are only in the gaming industry because we love games so much; if we just wanted a fat paycheck, we would probably all go work for Google or Zynga.
So if we, the gamers of the world, are unhappy with the status quo, then we need to get off the forums and out of the comment sections, and go out there and vote with our money. Let's tell the publishers what types of games we want, and exactly what types of content we will not pay for. And as for all the Alex Hinkleys out there: maybe if they tried to be part of the solution instead of the problem, developers and consumers alike would have an easier time realizing that we all just want the same thing: to play awesome games.