Breaking the Virtual Ice: Toxic Behaviors in Social VRs and How Developers Cope with Them

by Yasaman Farazan on 02/16/21 11:00:00 am   Featured Blogs

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

 

This is a short version of my final thesis. You can find the full version here.

Introduction

Social VR games are among the most commonly used VR applications. In these online social environments, players from different backgrounds can take part in casual conversations, content creation, or games. On the darker side, however, users may encounter behaviors that developers discourage or that violate real-world social norms. Social VR developers aim to build a healthier, more welcoming place for as many players as possible.

Even though social VR shares many characteristics with other social media services and digital games, it fosters unique interactions that stem from its immersive quality. Managing the new behaviors that emerge from the freedom VR brings poses many challenges for developers, both technically and design-wise. There is no doubt that they need to act against toxicity: it is not only good for their community and publicity, it also has an economic justification, since many players who have experienced toxic behavior feel anxious about returning to that game.

I tried to gain more insight into social VR and the types of toxic behavior in these games, and to learn how various developers tackle these challenges. I prepared an online survey and analyzed the data to answer some of my questions. But before moving on to the research, let's go through some background on the topic.

Social VR

Social VR refers to online applications that focus on socializing in immersive virtual worlds. Besides socializing, these platforms can be used to play games, create content, or role-play, among other things. VRChat and Rec Room are two of the most famous social VR games, and both are available for free on almost all VR and non-VR platforms. As the chart below shows, their concurrent player counts have been rising recently.

VRChat and Rec Room concurrent users, based on Steam Charts

VRChat was released in 2017 and is still in early access. According to Steam Spy, it currently has more than 5 million owners. It has some advanced moderation tools that I haven't seen in any other VR game so far.

VRChat (2017)

Rec Room was released in 2016 and has around 1 million users. It is more activity-focused and has many official and community-made games and rooms. Unlike VRChat, which doesn't allow children under 13 to use the game, Rec Room has regulations and settings that let these users play. It is therefore more popular among younger audiences.

Rec Room (2016)

Toxic Behaviors and Violations

Virtual reality is an immersive technology and a more personal experience. These qualities are believed to make VR a great platform for creating positive impact by giving a strong sense of presence. When wearing a VR headset, the user almost detaches from the real world, so any abusive or discomforting act is also received more intensely. Such behaviors can make the experience unbearable for many individuals and deprive them of it entirely, even though the virtual world has the potential to be an inclusive space for everyone.

Due to the emergent nature of these applications, players' interactions can be very unpredictable. This brings community management challenges that are exclusive to VR, and even common toxic behaviors are perceived differently in a VR setting. For instance, whispering a threat in an unsettling way, or accompanying it with a gesture, leaves a more intense impression on the victim. At the same time, the synchronous nature of social VR makes it harder to record these events or escape a hostile situation.

Based on Oculus research, toxic behaviors in social VR can be roughly divided into three categories:

  1. Verbal harassment includes excessive swearing, explicit sexual language, cyber-bullying, inappropriate terms in usernames, threats to hack, hate speech, and violent speech. 
  2. Physical harassment consists of sexual harassment, such as touching someone in a sexual way or making sexual gestures, as well as stalking, blocking others' movements, entering others' personal space, or passing through others' avatars. 
  3. Environmental harassment is carried out with virtual objects, by showing offensive or violent content, or by using inappropriate avatars. These acts are usually intended to ruin others' experiences or exploit the system.

Other violations may not count as harassment but are still unacceptable in most social VR applications, such as children under 13 using social VR without supervision, or various forms of privacy violation.

Regulation and Social Control Tools

Social platforms are ongoing products, and their existence depends on their community. When it comes to games, each developer is responsible for their community, and they also hold the most powerful tools to steer their users toward better behavior. Over time, more tools and options have been provided to change the outcome of a player's unpleasant encounter. Options like muting, kicking, reporting, or banning other users are well known.

Another common tool in social VR is the Personal Space Bubble, which prevents users from getting unintentionally close to each other by making an avatar invisible once it enters this range.
Some games also have a safe zone feature that allows players to quickly isolate themselves from their surroundings and mute all other avatars in the room. This puts users in a space where they can take further action without any disturbance.
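Under the hood, a Personal Space Bubble boils down to a per-frame distance check with a soft edge. Here is a minimal sketch; the radius, fade band, and function name are invented for illustration, and a real engine would drive each nearby avatar's opacity with a value like this every frame:

```python
import math

def bubble_visibility(my_pos, other_pos, radius=1.0, fade_band=0.3):
    """Return an opacity in [0, 1] for another avatar.

    Avatars inside the bubble radius are fully hidden (0.0), avatars
    beyond radius + fade_band are fully visible (1.0), and avatars in
    between fade linearly so the cutoff isn't jarring.
    """
    dist = math.dist(my_pos, other_pos)  # Euclidean distance between avatars
    if dist <= radius:
        return 0.0
    if dist >= radius + fade_band:
        return 1.0
    return (dist - radius) / fade_band
```

The fade band is a design choice: an avatar that pops in and out at a hard boundary draws attention to itself, while a short linear fade keeps the effect unobtrusive.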

VR Chat players can access some social control tools using this menu

One feature common to all of these games is contacting the game's support. Facebook plans to go further with its social VR platform, Horizon, by recording some footage to investigate reports in more detail. Oculus has also launched a security feature that lets users send video reports to its Safety Center. Some games offer support inside the virtual world, meaning users can directly contact a virtual community manager avatar to report their case.

Facebook Horizon Safety Overview Video

Another interesting tool is VRChat's trust system. The Trust and Safety system is a ranking system that determines players' trust levels based on many variables. It is designed to shield users from annoying situations like loud sounds, visual noise, and other methods someone might use to ruin others' experiences. Ranks are also displayed on users' nameplates, and a rank called "Nuisance" is reserved for users who have caused problems for others.

VRChat Trust System Menu
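VRChat has not published the formula behind its ranks, but the general idea of mapping activity and report signals to a displayed rank can be sketched as a toy model. The inputs, weights, thresholds, and the non-"Nuisance" rank names below are all invented for illustration:

```python
def trust_rank(hours_played, friends, reports_against):
    """Map activity and report signals to a displayed trust rank.

    Toy model only: positive activity raises the score, reports
    against the player lower it, and heavy offenders are flagged
    regardless of how active they are.
    """
    score = hours_played * 0.5 + friends * 2 - reports_against * 25
    if reports_against >= 5:
        return "Nuisance"        # persistent offenders are flagged outright
    if score < 10:
        return "Visitor"
    if score < 50:
        return "New User"
    if score < 150:
        return "User"
    return "Trusted User"
```

The key design property is that the rank is derived from many signals rather than one, so a single false report can't demote someone, while a pattern of reports can.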

Each of these features works differently on each platform, and there is no standardized interaction for them in VR yet, so users have to relearn them in each game. Since using these tools interrupts immersion, developers try to implement them as intuitively as possible. For instance, Rec Room lets players mute or block a user by holding their palm up in front of them, instead of using a graphical user interface.
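A gesture like Rec Room's palm block is typically detected by checking whether the palm's normal vector points at the target avatar. The sketch below is a simplified, engine-agnostic illustration; the function name, threshold, and vector convention are assumptions, and a real implementation would also require the pose to be held for a moment to avoid accidental triggers:

```python
import math

def palm_facing_target(palm_normal, palm_pos, target_pos, threshold=0.85):
    """Return True when the palm faces the target within a tolerance."""
    # Direction from the palm to the target, normalized to unit length.
    to_target = [t - p for t, p in zip(target_pos, palm_pos)]
    mag = math.sqrt(sum(c * c for c in to_target))
    if mag == 0.0:
        return False
    to_target = [c / mag for c in to_target]
    # Cosine of the angle between the palm normal and that direction;
    # a threshold of 0.85 allows roughly a 30-degree cone of tolerance.
    cos_angle = sum(a * b for a, b in zip(palm_normal, to_target))
    return cos_angle >= threshold
```

The cone of tolerance matters for usability: too tight and players struggle to trigger the gesture under stress, too loose and ordinary hand movement triggers it by accident.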

Even though social VR and online toxic behavior are complicated subjects, I think we have laid the groundwork for what comes next: the research.

The Research

I prepared an online survey using Google Forms, with 24 questions in 6 sections. You can check the survey here. Some questions were mandatory and some were optional. I included a short description of social VR but tried not to reveal the research goals. The first part contained some demographic questions, followed by general questions about the participants' experience in VR.

My online survey on Google Forms

I shared the survey on Reddit, on my personal Twitter account, and on related Discord community servers. The survey received 106 responses; after removing the incomplete or spam answers, I ended up with 96.

The participants' ages ranged from 12 to 49, with an average of about 21; many participants were 15-year-old teenagers. Most data came from male players and from European countries and the United States.
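The cleaning step above can be sketched in a few lines. The field names and the validity checks here are invented, since the actual form fields and spam criteria aren't spelled out in this summary:

```python
def clean_responses(responses):
    """Keep only complete responses with a plausible age.

    A response is dropped if any required field is missing or empty,
    or if the reported age falls outside a plausible range
    (hypothetical criteria for illustration).
    """
    cleaned = []
    for r in responses:
        complete = all(r.get(f) not in (None, "") for f in ("age", "gender", "platforms"))
        plausible_age = isinstance(r.get("age"), int) and 10 <= r["age"] <= 90
        if complete and plausible_age:
            cleaned.append(r)
    return cleaned

def average_age(responses):
    """Mean age over the cleaned responses."""
    ages = [r["age"] for r in responses]
    return sum(ages) / len(ages)
```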

Most people said they use VR games every day or several times a week. As expected, almost all participants said they use VR mostly for gaming and socializing, and VRChat and Rec Room were the most popular platforms. This was an expected result, considering their popularity and the fact that I mostly shared the survey on channels related to these two games.

Typical activities of participants in VR


The remaining survey questions revolved around the participants' social VR experiences and their thoughts on and encounters with toxic behavior on these platforms. In the first question, I asked for an overall rating of their social VR experience, and most participants rated it very positive. I then asked them what they would change about VR if they could change only one thing. Among all responses, the most common themes were toxic behavior and requests for fewer children on the platform.

To better understand the frequency of toxicity, participants were asked to report which items they had witnessed in a VR setting. Most of them said they had encountered one or more of the listed items.

Experienced or witnessed behaviors in social VR
Excessive swearing, children using content intended for adults, making sexual gestures, passing through others' avatars, touching someone sexually, hate speech, entering others' personal space, and inappropriate terms in usernames were the most commonly reported cases, each experienced by more than half of the participants.

Another analysis of this data shows the gap between genders, as you can see in the visualized data above. This difference is most visible for stalking, entering others' personal space, hate speech, display of offensive content, privacy violations, and violent content.

In the next question, participants rated the behaviors mentioned previously by how inappropriate they found them in a virtual world. Overall, physical harassment, like entering others' personal space, received the lowest ratings of the three forms of harassment. Female participants rated harassment like being touched sexually or seeing sexual gestures as intolerable, and found entering personal space more inappropriate than male participants did.

Verbal harassment items like excessive swearing and inappropriate terms in usernames also received low ratings and were considered tolerable; the exception was hate speech, which was rated the most extreme item of all.

Environmental harassment, like displaying offensive or violent content, was also a common concern among participants. Other violations, such as privacy violations and children using social VR, were both rated among the more inappropriate actions.

How inappropriate they find each item in Social VR
Based on the survey, two social control tools, muting and reporting, were very well known and widely used by the participants. This is no surprise, considering the frequency of verbal harassment in these social spaces. The two least recognized and least used tools were the personal safe zone and voting to suspend users; this may be due to the survey's wording and the lack of common terms for them compared with more conventional tools.
This finding confirms that most players are aware of the available tools to protect themselves from unsettling situations. Overall, participants were satisfied with VR developers' performance in preventing or controlling toxic behaviors.


 Knowledge and usage of social control tools

Limitations and Conclusion

Before reaching the final conclusion, I would like to mention some limitations of my research. First, although I tried to gather answers from various places, the sample group was recruited through channels populated mostly by dedicated players. While this could be seen as a merit of the research, it may also mean the answers are more biased.

The next limitation comes from the games themselves. VRChat and Rec Room are both very popular social VR games, but they differ greatly in overall experience, community moderation, and player culture. The gender ratio was also unbalanced, which makes the gender-related results less valid. Finally, since the survey was conducted entirely online, it was prone to mistakes and misunderstandings by participants, and my study did not distinguish between private and public virtual venues or between received and witnessed toxicity.

To sum up, I found that many players encounter various forms of harassment in their social VR experience, and that female players are targeted more often. I learned about forms of antisocial behavior that are feasible in this context but may not be possible in other environments.
I found that players rated physical harassment as less irritating than verbal harassment; this result may be due to other factors, such as differences in participants' experiences or the research's limitations. I also found that many players leave a lobby or a game whenever they encounter abusive behavior.

Finally, as one participant pointed out, these platforms have their own subcultures, and each subculture or group has its own implicit rules and social norms that may not be acceptable to others. This gets even more complicated in public hubs, since people from different cultures and backgrounds may find fundamentally different situations offensive.


Thank you for reading my summary. If you are interested in the study, you can check my full thesis here. I'd be very interested to hear your comments or answer your questions.

 

Main References and Further Reading

VRChat Terms of Service

VRChat Safety and Trust System

RecRoom Junior Accounts

Rec Room Comfort and Moderation for VR Players

Fostering a positive community in VR

Too real: Fighting sexual harassment, abuse and violence in virtual worlds

If Virtual Reality Is Reality, Virtual Abuse Is Just Abuse.

Online Harassment 2017

Harassment in Social Virtual Reality: Challenges for Platform Governance

It Is Complicated: Interacting with Children in Social Virtual Reality

