How To Build Safer Spaces in VR Games

In just a decade, old articles from VR naysayers have started to read a little like 18th-century arguments against reading books. Between the unexpected push toward a digital workforce during COVID and the heated competition between tech giants such as Meta and Apple, our journey toward virtual reality seems to have accelerated tenfold. And while VR tech has been improving for years, it hasn’t always been a justifiable purchase.

Now, it is. Well…more justifiable, anyway. Not everyone can afford the $3,000+ price tag of Apple’s new Vision Pro headset, but given how much the Vision Pro can do compared to older VR systems, it may be worth the cost to many. VR app sales show that more people are willing to spend on VR: in February, Meta reported that 200 apps on its VR devices had each generated $1 million.

For the gaming community, this rise in VR has created a whole new type of gaming experience. And this, of course, comes with a whole new set of moderation challenges — and a need for more effective moderation solutions. Let’s go over some of our tips and best practices for creating resilient and inclusive VR spaces, including: 

  • The unique challenges of moderating voice chat in VR games
  • How to make effective and enforceable player codes of conduct
  • Guidelines for trust & safety teams
  • How to execute compliance-centered design

VR games: a new playing field for bad actors

One of the coolest parts of VR gaming is getting to interact with other players as if you’re really together. You wouldn’t type messages to each other if you were gaming in the same room; you’d talk!

With VR games, players use voice chat as their primary mode of communication. Unfortunately, it’s inevitable that bad actors will find ways to abuse voice chat — and that kind of toxicity can have a far more devastating effect than other types of online harassment.

“Even the common toxic behaviors are perceived differently in a VR setting,” says Yasaman Farazan, a blogger in the gaming space. “For instance, whispering a threat in an unsettling way or accompanying it with a gesture would leave a more intense effect on the victims. At the same time, the synchronous nature of social VR makes it harder to record these events or avoid the hostile situation.”

How do we build safer spaces in VR gaming?

These best practices will help you build a safer VR gaming experience for everyone involved, from the players on your platform to the moderation team reviewing reports of toxicity.

Make Reports Easy to Submit and Efficient to Review

Reporting functionality has been essential to creating safe gaming spaces for years. Now, with the rise of VR, it’s not just best practice — it’s necessary. In fact, Meta Quest now requires all VR games to have a User Report function to make VR gaming safer for everyone. Other platforms are likely to follow suit.

A moderation tool like ToxMod can help meet this requirement with its User Report Correlation feature: moderators can combine the power of ToxMod’s proactive toxicity detection and scoring with in-game player reporting tools to gain full context, better identify false reports, and fill in missing or erroneous information in real time.
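
How that correlation works under the hood depends on your integration, but the general pattern is straightforward: when a player report comes in, look up any proactive toxicity signals recorded for the same player and session around the same time, and attach them before a moderator ever opens the case. The sketch below is a hypothetical illustration of that pattern, not ToxMod’s actual API; every type and function name here is made up.

```typescript
// Hypothetical types -- illustrative only, not ToxMod's actual API.
interface PlayerReport {
  reportId: string;
  reporterId: string;
  reportedId: string;
  sessionId: string;
  submittedAt: number; // unix ms
  reason: string;      // category chosen by the reporting player
}

interface ToxicitySignal {
  sessionId: string;
  speakerId: string;
  timestamp: number;   // unix ms
  category: string;    // e.g. "harassment", "hate-speech"
  score: number;       // 0..1 confidence from proactive voice analysis
}

interface CorrelatedReport extends PlayerReport {
  matchedSignals: ToxicitySignal[];
  likelyFalseReport: boolean;
}

// Attach any proactive signals recorded for the reported player in the same
// session within a +/- 5 minute window around the report submission time.
function correlateReport(
  report: PlayerReport,
  signals: ToxicitySignal[],
  windowMs = 5 * 60 * 1000,
): CorrelatedReport {
  const matchedSignals = signals.filter(
    (s) =>
      s.sessionId === report.sessionId &&
      s.speakerId === report.reportedId &&
      Math.abs(s.timestamp - report.submittedAt) <= windowMs,
  );

  // No corroborating signal doesn't prove a report is false, but it is a
  // useful hint for prioritization and for spotting report abuse.
  const likelyFalseReport =
    matchedSignals.length === 0 || matchedSignals.every((s) => s.score < 0.3);

  return { ...report, matchedSignals, likelyFalseReport };
}
```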

Proximity chat, while useful in many types of games, is especially prevalent in VR, where big worlds and lobbies let players mingle freely. ToxMod’s Proximity Chat feature can help you identify where toxicity is coming from and (more importantly) who is affected, so you can make more informed moderation decisions.
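
The underlying idea is simple even if the engine details vary: take the offending player’s position at the time of a flagged utterance and find everyone within voice range, since those are the players who actually heard it. The sketch below is a minimal illustration assuming positions in meters and a made-up voiceRadius default; it is not ToxMod’s implementation.

```typescript
// Hypothetical sketch: who was within earshot when a toxic utterance occurred?
interface PlayerPosition {
  playerId: string;
  x: number;
  y: number;
  z: number;
}

// Players inside the voice falloff radius are treated as affected bystanders.
// The radius depends on your engine's proximity-chat settings (assumed here).
function affectedPlayers(
  offender: PlayerPosition,
  others: PlayerPosition[],
  voiceRadius = 15, // meters, an assumed default
): string[] {
  return others
    .filter((p) => p.playerId !== offender.playerId)
    .filter((p) => {
      const dx = p.x - offender.x;
      const dy = p.y - offender.y;
      const dz = p.z - offender.z;
      return Math.sqrt(dx * dx + dy * dy + dz * dz) <= voiceRadius;
    })
    .map((p) => p.playerId);
}
```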

Trust & Safety Starts and Ends with a Strong Code of Conduct

Reporting functionality is only truly useful if you have a Code of Conduct to guide reports. 

Taking a step back and looking at this from another context: let’s say you’re a kindergarten teacher welcoming your students into the classroom on their first day. One of the first things you’d do is clearly explain the rules and expectations for each of your students. That way they can’t claim “I didn’t know it was wrong!” when they steal someone else’s crayon.

So when it comes to the gaming space, not having a player Code of Conduct that’s concise, clear, and accessible will make it hard to address harassment, toxic behavior, and violent radicalization. With a Code of Conduct publicly available and already in place, players won’t be able to fall back on “I didn’t know it was wrong!” as an excuse.

To be clear, creating a Code of Conduct won’t stop all bad actors, but it will help your community know what is and isn’t acceptable (and empower them to report toxic behavior).

Many gamers are just now diving into VR and are experiencing toxicity and harassment in a whole new way. It may not be clear to them what is and is not reportable. A Code of Conduct is especially important for VR games if you want your players to feel they have the documentation and support to protect themselves.

Three things to keep in mind as you develop (or revise) your Code of Conduct:

  1. Use clear, concise, and plain language. This will help avoid confusion and make it harder for bad actors to find loopholes.
  2. Keep it easily accessible on both your website and in-game (one way to surface it in-game is sketched after this list). That way, if a player experiences questionable behavior, they can quickly find the code and submit a report so you can make voice chat moderation decisions sooner.
  3. Make your team available. Let’s say someone has made a player uncomfortable, but they’re not sure if the behavior actually goes against the Code of Conduct, and they don’t want to be penalized for a false report. Giving players an avenue to seek clarification will help well-meaning players avoid false accusations — either as the reporter or the one reported.
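
One lightweight way to cover points 2 and 3 is to build the Code of Conduct directly into the in-game report dialog: each reportable category deep-links to the relevant section of the code, and an “I’m not sure” option routes the player to your team instead of filing a formal report. The sketch below is a hypothetical configuration, with made-up category IDs and URLs, not a prescribed implementation.

```typescript
// Hypothetical report-dialog configuration; category IDs and URLs are made up.
interface ReportCategory {
  id: string;
  label: string;             // what the player sees in the dialog
  conductSectionUrl: string; // deep link into the published Code of Conduct
}

const REPORT_CATEGORIES: ReportCategory[] = [
  { id: "harassment", label: "Harassment or threats", conductSectionUrl: "https://example.com/conduct#harassment" },
  { id: "hate-speech", label: "Hate speech or slurs", conductSectionUrl: "https://example.com/conduct#hate-speech" },
  { id: "radicalization", label: "Violent or extremist content", conductSectionUrl: "https://example.com/conduct#extremism" },
];

// "Not sure" doesn't file a report; it opens a question to the Trust & Safety
// team so well-meaning players aren't penalized for false reports.
const NOT_SURE_OPTION = {
  id: "ask-the-team",
  label: "I'm not sure this breaks the rules -- ask the team",
  routesTo: "support-inbox" as const,
};
```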

Moderation Guidelines for Trust & Safety Teams

Without strong guidelines, every moderation decision becomes subjective, and reports start to pile up quickly. Worse still, nonexistent or vague guidelines leave your team guessing, asking questions like:

Is this word considered hate speech?

Is this truly abusive language or innocent competitive banter?

Is this violent radicalization or an exercise of free speech?

This lack of specificity will inevitably lead to two things:

  1. Moderation decisions will take longer as your team tries to sort out if a report is legitimate
  2. Toxic behavior will go unaddressed as a result of subjective decisions or moderator bias

To make sure the right disciplinary action is taken (and taken quickly), you need clear and concise guidelines that show your team exactly what is and isn’t acceptable.
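
What “clear and concise” looks like varies by game, but many teams encode their guidelines as a simple rubric: each violation category gets a plain-language definition, concrete examples, and an escalation ladder, so two moderators reviewing the same report land on the same action. The sketch below shows one hypothetical shape for that rubric; the categories, examples, and actions are illustrative, not a standard.

```typescript
// Hypothetical guideline rubric; categories, examples, and actions are examples only.
type Action = "warn" | "mute-24h" | "suspend-7d" | "ban";

interface GuidelineEntry {
  category: string;
  definition: string;   // plain-language description for moderators
  examples: string[];   // concrete, regularly updated examples
  escalation: Action[]; // first offense, second offense, ...
}

const GUIDELINES: GuidelineEntry[] = [
  {
    category: "hate-speech",
    definition: "Slurs or dehumanizing language targeting a protected group.",
    examples: ["slurs (see current slur list)", "coded terms from the evasion watchlist"],
    escalation: ["mute-24h", "suspend-7d", "ban"],
  },
  {
    category: "competitive-banter",
    definition: "Trash talk without slurs, threats, or targeting outside the match.",
    examples: ["'you got wrecked'", "taunting after a win"],
    escalation: [], // not actionable; documenting this saves review time
  },
];

// With the rubric written down as shared data, "what happens next" becomes a
// lookup rather than a judgment call.
function nextAction(entry: GuidelineEntry, priorOffenses: number): Action | null {
  if (entry.escalation.length === 0) return null;
  const idx = Math.min(priorOffenses, entry.escalation.length - 1);
  return entry.escalation[idx];
}
```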

This clarity shouldn’t mean the guidelines are static, however. Language and terminology change quickly, especially as bad actors find new words and phrases to avoid being reported. The demographics of your community will also change — a newer game likely has a smaller (and often kinder) player base that is committed to keeping the game safe. The more your game grows, the more likely it’ll become a hotbed for bad actors.

Revising your moderation guidelines 1–2 times per year (and running regular trainings for your moderation team) will help you keep pace with the growing number of bad actors and their increasingly innovative ways of disrupting the experience of your invested and involved community.

As you grow, your moderation team will come into contact with more and more toxic behavior — and this can be especially taxing in VR voice chat moderation. This level of exposure can cause compassion fatigue and burnout, so it’s important to spotlight resources to support mental health, such as this guide from our friends at Take This or proprietary onboarding resources.

Build Trust with Transparency at Every Level

Game updates and new features can take time, so many game studios use their blog, social media accounts, or a Discord server to share progress. When players know what is being actively worked on — whether it’s a bug fix or a new feature — they are often more understanding of the time these things take.

That level of transparency is also useful when it comes to game safety and moderation. Here are some ways you can increase transparency:

Clarify the reporting process. Players become understandably frustrated when they hear nothing back from the studio after submitting reports of toxicity and disruptive behavior. Keep your players informed about which stage of the review process their report is in, whether through an in-game notification or an email (a minimal report lifecycle is sketched after these tips). These incremental updates demonstrate to players that you’re taking their concerns seriously, even if the entire process takes longer than they’d like.

Make your moderation actions known. Don’t just share game updates or feature releases with your users; share community safety updates as well. Sharing the number of reports received and the number of offenses addressed will help you build trust with your players (and celebrate the efforts of your Trust & Safety team).

Involve your community. Be proactive in soliciting the input of your players. Collecting feedback through quick surveys can point you to areas of the game that need changes while showing that you aren’t making decisions in a vacuum. This feedback is especially important as game devs learn how to best support and protect players in VR.
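
Concretely, keeping players in the loop usually comes down to modeling each report’s lifecycle and notifying the reporter whenever the status changes; the same records can then be rolled up into the periodic community safety updates described above. The sketch below assumes a hypothetical notify() hook and simple in-memory records, and is only one way to structure this.

```typescript
// Hypothetical report lifecycle; the statuses and notify() hook are assumptions.
type ReportStatus = "received" | "under-review" | "action-taken" | "closed-no-action";

interface ReportRecord {
  reportId: string;
  reporterId: string;
  status: ReportStatus;
  updatedAt: number; // unix ms
}

// Stand-in for however you actually reach players (in-game notification, email, etc.).
function notify(playerId: string, message: string): void {
  console.log(`[notify ${playerId}] ${message}`);
}

// Update a report's status and tell the reporter where it stands.
function updateReportStatus(record: ReportRecord, status: ReportStatus): ReportRecord {
  const updated = { ...record, status, updatedAt: Date.now() };
  notify(record.reporterId, `Your report ${record.reportId} is now: ${status}`);
  return updated;
}

// Roll the same records up into a periodic community safety update.
function safetySummary(records: ReportRecord[]): { received: number; actioned: number } {
  return {
    received: records.length,
    actioned: records.filter((r) => r.status === "action-taken").length,
  };
}
```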

Not an Afterthought: Compliance-centered design

As VR becomes more popular, we have to protect the most vulnerable population in the VR world: children. And with Meta lowering the minimum age for Meta Quest users from 13 to 10, the need for effective moderation solutions is increasingly urgent.

While Meta is implementing restrictions (like parent-controlled accounts for kids 10-12) and removing ads for this age group to create an “age-appropriate experience,” this won’t stop bad actors from finding new ways to target children.

The result will inevitably be stricter compliance requirements from Meta and from regulators. Start preparing for these requirements early so you can avoid legal battles and fines and create a safer space — especially for kids. Moderation solutions must be configurable and compliant by design. Make sure the moderation solution company you’re working with can tell you exactly how their tool will help you keep children on your platform safe. Beyond the moral imperative to protect children in online spaces, it’s not worth the risk of regulatory fines or public backlash.
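
“Configurable and compliant by design” can sound abstract, so here is one way it tends to show up in practice: the same moderation pipeline behaves differently depending on the age band of the account, with stricter defaults, earlier escalation to human review, and parental controls for younger players. The sketch below is a hypothetical configuration shape; the bands, thresholds, and defaults are illustrative and are not a statement of Meta’s or any regulator’s actual requirements.

```typescript
// Hypothetical age-band policy config; thresholds and defaults are illustrative only.
type AgeBand = "10-12" | "13-17" | "18+";

interface SafetyPolicy {
  voiceChatDefault: "off" | "friends-only" | "on";
  requiresParentalConsent: boolean;
  adsEnabled: boolean;
  escalateToHumanReviewAt: number; // toxicity score threshold (0..1)
}

const POLICY_BY_AGE_BAND: Record<AgeBand, SafetyPolicy> = {
  "10-12": {
    voiceChatDefault: "off",
    requiresParentalConsent: true,
    adsEnabled: false,
    escalateToHumanReviewAt: 0.4, // lower threshold: escalate earlier for kids
  },
  "13-17": {
    voiceChatDefault: "friends-only",
    requiresParentalConsent: true,
    adsEnabled: false,
    escalateToHumanReviewAt: 0.6,
  },
  "18+": {
    voiceChatDefault: "on",
    requiresParentalConsent: false,
    adsEnabled: true,
    escalateToHumanReviewAt: 0.8,
  },
};

function policyFor(band: AgeBand): SafetyPolicy {
  return POLICY_BY_AGE_BAND[band];
}
```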

A safer VR experience for players and moderators alike

VR gaming is growing fast, which means studios, devs, and moderation teams have to work hard to keep up — but maybe not quite as hard as you think. ToxMod gives devs and moderation teams the tools they need to take decisive action quickly and accurately, so you can create a safe (and compliant) VR gaming experience.