GDC: Riot Experimentally Investigates Online Toxicity
by Jim Cummings on 03/31/13 09:42:00 pm   Featured Blogs

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

 

This and other posts can be viewed at Motivate Play.

 

Riot Games has gotten a fair amount of press in recent months regarding their empirically based research on the nature of “toxic” player behavior in League of Legends. As a result, I wasn’t surprised to find standing room only for the talk by Jeffrey Lin (Lead Designer of Social Systems for LoL) on research-informed measures for managing toxic behavior in online games.

Lin opened with the common notion that online gameplay has an inherently toxic element that must simply be accepted. However, because this assumption is costly (players leave due to toxicity), Lin and his team – including several Ph.D.s in fields ranging from cognitive neuroscience to human factors, who are themselves gamers – have sought to challenge it. Working with designers, marketing, UI, and other production staff, this team of research specialists has conducted a series of in-game empirical studies investigating the nature of toxicity and features that might mitigate it.

Riot first constructed “behavior profiles” for individual players, examining the severity of toxic offenses across game sessions. They found that severe toxicity in a given player is rare; rather, many games seem toxic because a single player is having a single, uncharacteristic “bad day”. This led the team to infer that toxicity may perpetuate through a ripple effect, as negativity fleetingly spreads from one player to others.
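To make the idea of a behavior profile concrete, here is a minimal Python sketch of that kind of per-player aggregation. The field names (player_id, session_id, toxicity_score) and the derived statistics are illustrative assumptions on my part, not Riot's actual pipeline.

from collections import defaultdict
from statistics import mean

def build_behavior_profiles(reports):
    """Aggregate per-session toxicity scores into a simple per-player profile."""
    scores_by_player = defaultdict(list)
    for r in reports:  # each report: {"player_id": ..., "session_id": ..., "toxicity_score": ...}
        scores_by_player[r["player_id"]].append(r["toxicity_score"])

    profiles = {}
    for player_id, scores in scores_by_player.items():
        profiles[player_id] = {
            "games_observed": len(scores),
            "mean_toxicity": mean(scores),
            # A high worst_session paired with a low mean suggests a one-off
            # "bad day" rather than a chronically toxic player.
            "worst_session": max(scores),
        }
    return profiles

# Toy usage: one player with a single bad game among otherwise clean sessions.
profiles = build_behavior_profiles([
    {"player_id": 1, "session_id": "a", "toxicity_score": 0.05},
    {"player_id": 1, "session_id": "b", "toxicity_score": 0.90},
    {"player_id": 1, "session_id": "c", "toxicity_score": 0.10},
])
print(profiles[1])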

To investigate this idea, the researchers conducted an experiment in which cross-team chat – one of the main venues for negative interactions – was made optional for individual players. And indeed, they observed a significant decrease in all measures of toxicity (offensive language, obscenity, and displays of negative attitudes). Moreover, the overall use of chat remained the same: 46-47% of games included no chat, both before and after the change. Lin and his team therefore concluded that shielding players from toxic behavior can in fact prevent it from spreading.
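Lin didn't detail the statistics behind the "significant decrease", but a before/after comparison of toxicity rates like this can be sanity-checked with something as simple as a two-proportion z-test. The sketch below uses invented counts; only the shape of the analysis is the point.

from math import sqrt, erf

def two_proportion_z_test(toxic_a, total_a, toxic_b, total_b):
    """Two-sided z-test for a difference in toxicity rates between two samples."""
    p_a, p_b = toxic_a / total_a, toxic_b / total_b
    p_pool = (toxic_a + toxic_b) / (total_a + total_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    # Standard normal CDF via erf; two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts of games with at least one toxicity report, NOT Riot's data:
z, p = two_proportion_z_test(toxic_a=5200, total_a=40000, toxic_b=4100, total_b=40000)
print(f"z = {z:.2f}, p = {p:.4f}")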

Following this, the team wondered whether toxicity, and attitudes about it, could be changed by engaging players regarding their negative behavior. This led to the introduction of LoL’s Tribunal – a system by which the player community votes on whether a given player’s behavior should be sanctioned and on the severity of any punishment (usually in terms of how many days a ban should last). Lin noted that, as of two weeks ago, the Tribunal has registered 105 million votes and, perhaps more impressive, has led to 280,000 reformed players (those who have been punished previously but are currently in positive standing). With regard to the “accuracy” of these social sanctions, Lin also noted approximately 80% agreement between the community and Riot’s in-house team (with the team actually being the more severe of the two parties).
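That 80% figure is essentially a rate of matching verdicts between two reviewers of the same cases. A toy version of the comparison might look like this; the verdict labels and sample data are assumptions for illustration, not Riot's.

def agreement_rate(community_verdicts, in_house_verdicts):
    """Fraction of Tribunal cases where community and in-house verdicts match."""
    assert len(community_verdicts) == len(in_house_verdicts)
    matches = sum(c == h for c, h in zip(community_verdicts, in_house_verdicts))
    return matches / len(community_verdicts)

# Made-up verdicts; note the in-house reviewers skewing slightly more severe.
community = ["punish", "pardon", "punish", "punish", "pardon"]
in_house  = ["punish", "punish", "punish", "punish", "pardon"]
print(f"agreement: {agreement_rate(community, in_house):.0%}")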

Lin then explained that, up to that point, players punished with toxicity-related account bans in LoL were typically given vague notifications – messages that detailed the length of the ban but offered little real detail on why the sanction was coming down on them. The team therefore conducted a third experiment in the series, investigating whether explicit feedback on previous behavior would increase the reform rate. All banned players were sent Tribunal reform cards providing greater detail on the player’s offense. Not only did reports of toxic behavior decrease afterward, but forum posts showed that when offenders complained about how a particular behavior had led to a ban, the community generally agreed with the punishment. What’s more, according to Lin, penalized players have written in to the moderators, apologizing for their behavior and asking for guidance on how to reform or prevent future transgressions.

Sample League of Legends forum comments shared by Dr. Jeffrey Lin of Riot Games. No copyright of slide contents claimed by Motivate Play.

Finally, Lin closed with a quick summary and partial report on one of the team’s most recent efforts, a study dubbed the “Optimus Experiment”. Reflecting on the psychological literature on priming effects, Lin and colleagues wondered if it might be possible to prime players so as to reduce toxic behavior. To explain the concept to the non-academic audience, he noted that brief exposure to the color red can cause people to do relatively worse on an exam and that exposure to words related to the elderly can result in people walking more slowly (likely referring to this work on color association and Bargh’s renowned experiments on stereotype priming).

The experiment was a 5 (information category) x 3 (color) x 4 (information location) factorial design, in which players received different types of game-related information in different game screens.  Category types included positive player behavior stats, negative player behavior stats, self-reflection notes, fun facts and a control (general gameplay tips).  The font colors for these messages included red, blue (thought to be associated with creativity) and white (control).  Message display location conditions included the loading screen, in-game, both, and none.
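For readers unfamiliar with factorial designs, 5 x 3 x 4 simply means every combination of category, color, and location is its own experimental cell – 60 in total. A rough Python sketch using the condition names from the talk (the bucketing function itself is my own assumption, not Riot's assignment method):

from itertools import product

CATEGORIES = ["positive_behavior_stats", "negative_behavior_stats",
              "self_reflection", "fun_facts", "control_tips"]
COLORS = ["red", "blue", "white"]
LOCATIONS = ["loading_screen", "in_game", "both", "none"]

# Every combination of the three factors is one cell of the design.
CONDITIONS = list(product(CATEGORIES, COLORS, LOCATIONS))
assert len(CONDITIONS) == 5 * 3 * 4  # 60 cells

def assign_condition(player_id: int):
    """Deterministically bucket a player into one of the 60 cells (illustrative only)."""
    return CONDITIONS[player_id % len(CONDITIONS)]

print(assign_condition(123456))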

Lin briefly shared, for the first time, just a few of the results from the Optimus study. One interesting finding was that showing a red message about negative behavior during the loading screen led to a much larger decrease in toxic behaviors (in terms of attitude displays, abuse, and offensive language) than did the exact same message in a white font. Additionally, showing a blue message about positive, cooperative behavior during the loading screen also led to a decrease in negative behavior, while no effect was observed for the same message in white. And, interestingly, when the question “Who will be the most sportsmanlike?” (a positive behavior message) was presented in red, the toxic behavior metrics all actually increased.
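Each of these comparisons boils down to measuring a colored-font cell against its white-font counterpart for the same message and location. A hedged sketch of that bookkeeping, assuming a simple per-game record format; any numbers you feed it would be invented, as this is not the Optimus dataset.

from collections import defaultdict

def toxicity_by_cell(games):
    """games: iterable of dicts with 'category', 'color', 'location', and a boolean 'toxic'."""
    counts = defaultdict(lambda: [0, 0])  # cell -> [toxic games, total games]
    for g in games:
        cell = (g["category"], g["color"], g["location"])
        counts[cell][0] += g["toxic"]
        counts[cell][1] += 1
    return {cell: toxic / total for cell, (toxic, total) in counts.items()}

def vs_white(rates, category, color, location):
    """Relative change (%) of a colored cell versus the same message shown in white."""
    baseline = rates[(category, "white", location)]
    return (rates[(category, color, location)] - baseline) / baseline * 100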

Lin was quick to note that these studies are just the beginning, with a number of potential questions about the nature of toxic player behavior still to be examined. For instance, he briefly mentioned that the observed changes in behavior may be due to the spotlight effect, a possibility that further research could test more precisely.

Together, these results have led Lin and colleagues to conclude that players are not innately toxic and that context is key for shaping behavior in online gameplay. To this end, he suggested to the other developers in the audience that it is their responsibility to help their players – to provide the information and mechanics necessary for steering themselves away from negative behavior and bad choices, rather than to simply remove offending players from the game. Altogether, I was quite impressed with his team’s inclination to conduct these studies as well as the conclusions they drew from the collective results. It would seem that Riot, in turning to their scientists for answers, seeks to rout toxicity by understanding and refining the player experience (bottom-up) as opposed to simply extracting offenders wholesale (top-down). That is, user-centric, psychology-based design and policy rather than a blind “War on Toxicity” lacking nuance. Indeed, in not only giving scientists a seat at the table but also relying on their insights for important development decisions, Riot is positioning itself as one of the most well-informed designers of player experience, with likely long-term implications for both player retention and revenue.


Comments


Wes Jurica
I love this. It's a very interesting read. It is very nice to know that companies are using human behavior studies for the purposes of good (improving the player experience), rather than evil (applying Vegas-like gaming hooks).

Rafael Vazquez
Very interesting read, I can't wait to see further research published in this area.

Will Buck
I may be biased in my love for League of Legends, but I really do genuinely love what Riot is trying to do in combatting negative behavior. I think a lot of this can be applied to the internet as a whole as well, potentially paving the way to more civilized discourse over the web :)

Jorge Ramos
This is news to me. Especially since before I even considered the game, I automatically refused to even bother because the game already had such an infamously toxic environment because of the circle-jerk of everyone blaming everyone else.

Maybe if Microsoft implemented this with Live, I might have actually considered getting a subscription.

Michael Hartman
Great read and very interesting stuff.

But what it doesn't explain is why despite all these efforts, the toxicity of the LoL community seems to continually get worse, not better.

rodrigo chavez
I'm not a huge fan of this policy.

I think there are more problems in the long run with negative reinforcement. The nature of games like this means that even within your own team there will be unavoidable conflict - players will differ on what strategy to employ, they'll want to pick the same characters, they'll want to choose the same lane, etc. These conflicts don't have to be resolved in a toxic manner (though I'm sure they're the seed of many toxic games) but they do need to be resolved. In an environment where a ban hammer is hanging over your head, it's more in your favor to defer to the other players and stomach an avoidable loss rather than risk mob justice.

If you want to try something creative you run the risk of your four team mates striking it down, with the looming threat of them reporting you for "not being a team player" or one of the many other reasons that fit within the tribunal's nebulous criteria for bans.

Here's an example of what might be problematic: http://euw.leagueoflegends.com/board/attachment.php?attachmentid=86542&d=1337730023

This player was banned for doing things such as randoming in a ranked match and choosing non-standard champion and item picks. Annoying? Yeah, I can see it being annoying if you think it's going to cause you to lose. Is it toxic? Does it warrant a ban? I really hope not. Diverging from a certain playstyle shouldn't justify having your account suspended.

Jess Groennebech
He could have had the same experience by joining a match that wasn't ranked; instead he chose to ruin the game for 9 other players who wanted a serious experience. It sets a standard for others to behave in a similar fashion. I agree completely with the Tribunal (and I don't even play LoL).

Granted, there are some grey areas in what you're pointing at, e.g. what if someone knows a decent strategy with a hero but fails again and again, and people report him because they can't see what he is attempting, and so on... but it's his job as part of the team to explain and get a general consensus on what his function is on the team.

If he just wants to lame around and try out new things, then he is most certainly welcome to do so in matches that aren't ranked.

Joe Kinglake
Jess, that seems like an awfully elitist attitude. I personally enjoy playing Starcraft as random; I find it adds that extra flavour to the game. I've not played League of Legends, but it seems like banning players because they enjoy the game slightly differently from how it was designed to be enjoyed (which imo is a good thing) is also toxic. I agree to some extent that there should be a matter of seriousness in ranked games; however, you have to remember it still is a game and players shouldn't be banned for enjoying it.

Rob O
If someone is picking champs like that in ranked, they ARE almost assured to lose. As Jess said, this is toxic because it ruins the game for 9 other players: the 4 on your team who take a loss and the 5 on the other side, as it wasn't a real fight. No one likes a 4v5, and this is what this person continually did.

Jess also nailed the gray area comment. Take AP Trynd: just as crazy as AP Hecrim, except it works. The difference is, the guy that made AP Trynd work perfected it, then went into ranked and won 80% of his games with it, thus making it acceptable.

The point of ranked play is to take the game seriously and do what you must to win. That is not what this person did. He would have been completely fine in a normal game, and that is the important distinction to be made.

Joe, Jess' attitude is hardly elitist. League of Legends is nothing like Starcraft other than the top-down view. In Starcraft you can totally go random; there is even a rank for randoming. It works, too. The difference is there are only 3 races in Starcraft and all 3 can be played in ANY situation. That is the difference. League has around 100 champions and they fit certain roles. You wouldn't queue up for the healer position in WoW as a Hunter; that wouldn't work, and if you tried to make it work your group wouldn't like you.

The problem in this case is that the person is enjoying the game at the expense of 9 other players; that is why he should be punished. If you randomed in Starcraft that wouldn't be the case.

Chris Hekman
While I support the cause of having a non-hostile game environment, I think there is an important part that is being overlooked. Instead of jumping on to the next part, they should consider whether the side-effects are worth it. On the EU-West server you would run into a troll or "toxic" person once every 4 games. But now with the report system, I see report spam in chat nearly every single game. It is the single most infuriating thing to have someone all-chat "report x, because y" every time someone makes a mistake or bad call.

Bruno Xavier
"Riot Experimentally Investigates Online Toxicity"...

No. They don't.

