The global controversy over video game loot boxes is now over a year old, and shows no signs of abating. In fact, quite the opposite: consumer outrage continues to grow, and governments around the world are increasingly interested in regulating the use of loot boxes and other microtransactions. Several US states have proposed legislation to restrict the marketing and sale of games containing loot boxes, and although none of these proposals has come to fruition, other regions have been more successful in extending the visible hand of government over video games. Most notably, Belgium and the Netherlands have taken steps to eliminate loot boxes, and various other countries are launching investigations and public hearings to examine the problem.
But what exactly is the problem? In addition to the fairness issue (i.e. "pay to win"), the ability to use real-world currency to pay for randomized content also carries important psychological and regulatory implications, especially relating to addiction and gambling. It’s because of these problems that governments are considering various interventions. The gambling question is mainly a legal issue that varies depending on the game and on the region in which it’s sold, and there is no one-size-fits-all answer to it. Similarly, there is no serious empirical evidence yet to support the claim that loot boxes are addictive or psychologically equivalent to gambling. Yet sadly, public policy is rarely based on evidence, and the push for regulation continues, despite flimsy foundations. Gamers, for their part, seem perfectly happy encouraging government to take action against developers and publishers that incorporate loot boxes into their products.
What this discussion needs most are economic studies of microtransaction models and of the arguments for regulation. I have taken some steps in this direction in a new paper that discusses how both regulators and the game industry itself have reacted to the loot box controversy (see here). Much more needs to be done to study microtransactions, but this is as good a place as any to start.
The main conclusion of my paper is that there are economic reasons for companies to rely more on revenue from DLC, skins, loot boxes, and so on, and furthermore, that there is evidence the industry has taken the loot box fiasco seriously and is changing in response. Although I do not make this point explicitly in the paper, one further implication is that government intervention is unnecessary, because regulation by the market is quicker, more effective, and doesn’t disrupt competition.
Contrary to popular belief, microtransactions are not a get-rich-quick scheme invented by greedy businesses: the reality is more complicated. In particular, microtransactions allow developers to overcome growing production costs. AAA game development is expensive, and costs are ballooning, especially due to factors like the increased need for persistent customer support, server upkeep, licensing and other intellectual property-related issues, and basic software and hardware expenses. At the same time, the market is saturated, gamers have many alternatives, and attention spans (and thus, the shelf-lives of games) are growing shorter. The upshot is that it’s far more difficult than it was a few years or decades ago for a AAA title to turn a profit based solely on its sticker price.
Microtransactions and loot boxes have emerged as potential solutions. I say “potential” because they’re just that: solutions that are still being tested in the market. This is exactly what entrepreneurs do, of course: experiment with different ways of doing business to see which ones do the best job of serving consumers. Although microtransactions are already quite lucrative for the industry, the backlash they’ve caused among consumers also shows they are quite costly. In other words, the market is already doing a good job of “punishing” the poor judgments of companies like EA that rely excessively on these revenue models. In fact, at the beginning of the controversy in November 2017, EA removed loot boxes from Star Wars: Battlefront II before it was officially released—about as early as possible, given the circumstances. The publisher’s latest offering, Battlefield V, employs a system in which real money can only be used for cosmetic content.
Other developers have faced problems similar to EA’s, and are experimenting with their own solutions. For example, in response to customer complaints, Middle Earth: Shadow of War and Quake Champions removed their loot boxes after release. Forza Motorsport 7 also removed loot boxes, replacing them with an in-game shop where players buy specific items directly: the gimmick is that the shop’s inventory changes every few minutes, which keeps an element of chance while effectively transferring the risk away from the gamer. Other games like Forza Horizon 4 avoid the accusation of gambling by removing the ability to use real money to pay for the in-game currency used in microtransactions.
A variety of other responses have appeared as well. Apple, whose app store is a major distribution platform, has revised its terms and conditions to require games to disclose loot box odds up front so that players know exactly what they’re being offered. This is now a common tactic, and developers have independently revealed loot box odds for titles like Overwatch, FIFA 19, and Rocket League.
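Disclosed odds matter because they let players do simple expected-value arithmetic before spending. A minimal sketch of that arithmetic (the drop rate and box price below are hypothetical, purely for illustration, not taken from any of the titles mentioned):

```python
# Back-of-the-envelope: what a disclosed drop rate implies for players.
# All numbers are hypothetical, for illustration only.

def expected_boxes(drop_rate: float) -> float:
    """Expected number of boxes opened before a specific item drops,
    assuming independent draws (mean of a geometric distribution)."""
    if not 0 < drop_rate <= 1:
        raise ValueError("drop rate must be in (0, 1]")
    return 1 / drop_rate

def expected_cost(drop_rate: float, price_per_box: float) -> float:
    """Expected total spend to obtain the item once."""
    return expected_boxes(drop_rate) * price_per_box

# A disclosed 1% drop rate at $2.50 per box implies an expected
# spend of $250 to pull one specific item.
print(expected_cost(0.01, 2.50))  # → 250.0
```

The point of mandatory disclosure, in other words, is that it turns an opaque gamble into a calculation any consumer can make.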
Self-regulatory bodies like the Entertainment Software Rating Board (ESRB) are also working to allay fears of exposing children to addictive or costly game mechanics. The ESRB has adjusted its ratings system so that games with microtransactions are now marked as containing “in-game purchases,” giving parents a clearer idea of what their children are playing. In Europe, the PEGI system has adopted a similar policy. The ESRB has also launched a new website to help educate parents about the meaning of its ratings system and how they can better safeguard their children. It’s notable that many of these changes are being made by people and organizations that do not agree that loot boxes are psychologically dangerous, deceptive, or illegal gambling. Yet that’s the beauty of the market: consumers ultimately get what they want, and entrepreneurs either provide it or go bust.
Although the loot box controversy is ongoing, and may prove to be in its early days, changes throughout the past year indicate that the prospects for meaningful self-regulation are bright. After all, self-regulation is basically another way to talk about regulation by consumers, who still wield considerable influence in the industry, and are making their opinions heard (loudly).
However, the future of gaming is not entirely rosy. One of the more disturbing features of this controversy has been the willingness of gamers to use legislation to get what they want. Typically, gamers and developers have been united in opposition against regulation, as they were, for example, in the 1990s controversies about potential links between games and violence. In those debates, players were adamant that government not be allowed to censor content. Yet when it comes to microtransactions, gamers appear eager to demand regulatory solutions to what are little more than poor business decisions. This is a different and more dangerous kind of threat than the one posed by, say, over-zealous politicians using topical legislation to generate forgettable election-season soundbites for donors. Microtransactions are generating more persistent and serious interest, and some commentators believe this is a watershed moment when the game industry will be forced to accept regulatory oversight. One thing is certain: if consumers and regulators continue their joint venture against the industry, and if the industry fails to placate them, it won’t remain independent for long.