ToxMod’s Powerful Voice Moderation is Now Available on Nintendo Switch

We’ve got some big news: as of today, Modulate is ready to arm studios developing for the Nintendo Switch system with the tools they need to effectively fight toxicity in games. ToxMod – our real-time anti-toxicity voice moderation platform – is now officially listed on the Nintendo Developer Portal, making it easy to integrate our cutting-edge anti-toxicity tools into Nintendo Switch games.

Why Fighting Toxicity is Important

Developers publishing on Nintendo Switch often strive to deliver highly entertaining, accessible, and family-friendly games — and game-savvy parents want to make sure their kids are safe. 

It’s easy to understand why: 83% of adult gamers, across every player demographic, report experiencing toxic behavior online. And anyone can be targeted. According to a survey from the Anti-Defamation League, 38% of women and 35% of LGBTQ+ players in the U.S. face targeted toxicity, and roughly one-quarter to one-third of players who are African American (31%), Hispanic/Latinx (24%), or Asian American (23%) face racial harassment online. Across the games Modulate works with today, we have found that just 6.2% of players are responsible for a whopping 91% of severe toxicity happening over voice chat.

Making their communities more inclusive and less toxic isn’t just a duty or moral imperative for studios. Studios that are not proactively addressing these problems also face regulatory challenges: new laws require studios to provide safe spaces for their players, with fines reaching hundreds of millions of dollars.

We want to make gaming safer for everyone — developers, parents, and gamers alike. So how can we fight toxicity? By using technology to enable a proactive approach to moderation instead of a reactive one.

Why Fighting Toxicity Reactively Fails  

Unfortunately, most measures used to combat toxic players in voice-chat-enabled games today rely solely on reactive mechanisms:

  1. Someone says something awful.
  2. The victim reports the interaction through channels established by the company.
  3. For the small fraction of reports that are actionable, the moderation team investigates. 
  4. The bad actor is (sometimes) reprimanded or punished. 

This entire process only works after the damage is done, and its effectiveness is limited to reported cases. Many cases go unreported, which means the harm to the gaming community is likely occurring at a much larger scale than the industry can see. At best, the platform mitigates only a small fraction of the damage, and only after the fact.

And often, bad actors find ways around the punishments. 

Without significant, preventative mitigation, toxic behavior can lead to extremely unsafe and unfriendly game communities — and a serious attrition of players. 

Why Fighting Toxicity Proactively Works

Proactive voice chat moderation, in contrast with reactive moderation (player reports and voice transcription tools), is all about looking for signs of toxicity as it occurs across ALL in-game voice chat. Rather than relying on players to report bad behavior, proactive voice moderation notices when a conversation begins to take a dark turn and automatically captures the key data, either escalating to moderators or addressing the issue autonomously. This enables studios to respond faster and more comprehensively to any unfolding toxicity.
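To make that concrete, here is a minimal sketch of what a proactive pipeline along these lines might look like. It is written in Python with entirely hypothetical names (Segment, score, moderate, and the thresholds are our illustration, not ToxMod’s actual API): every segment of voice chat is scored as it arrives, recent context is buffered, and anything crossing a severity threshold is either escalated to a human moderator along with that context or handled automatically.

    # Minimal sketch of a proactive voice-moderation loop.
    # All names and thresholds are hypothetical, not ToxMod's real API.
    from collections import deque
    from dataclasses import dataclass

    ESCALATE = 0.7      # severity at which a clip goes to a human moderator
    AUTO_ACTION = 0.95  # severity at which the system may act on its own
    CONTEXT = 5         # recent segments kept so moderators see the exchange

    @dataclass
    class Segment:
        player_id: str
        transcript: str      # stand-in for the audio a real system analyzes
        severity: float = 0.0

    def score(segment: Segment) -> float:
        """Toy stand-in for a model scoring 0 (benign) to 1 (severe)."""
        flagged = {"<slur>", "<threat>"}  # placeholders, not a real lexicon
        hits = sum(word in flagged for word in segment.transcript.split())
        return min(1.0, hits / 2)

    def moderate(stream):
        """Scan every segment as it arrives; don't wait for a report."""
        context = deque(maxlen=CONTEXT)
        for segment in stream:
            segment.severity = score(segment)
            context.append(segment)
            if segment.severity >= AUTO_ACTION:
                print(f"auto-action: mute {segment.player_id}")
            elif segment.severity >= ESCALATE:
                # Capture the surrounding conversation so a human can
                # judge intent quickly.
                print(f"escalate {segment.player_id} "
                      f"with {len(context)} segments of context")

The design point is in the loop itself: coverage comes from scoring every conversation rather than only the ones players choose to report, and the buffered context is what lets a moderator judge intent quickly.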

Proactive voice-native moderation surfaces more than 30x as much toxicity as player reports alone; put differently, reports by themselves catch only a small fraction of what is actually happening. As gaming communities continue to grow, especially with the rise of virtual reality and the metaverse, a tech-enabled, proactive, voice-native approach is the only way to keep game communities safe.

ToxMod empowers studios to make voice chat safe — enabling them to proactively identify the worst behaviors and address problems before they happen. This benefits both game developers and players:

  • It removes the onus from players to report and police bad behavior.
  • It can help flag behavior that is harder to detect and rarely reported, such as child predation.
  • It drives efficiency by reducing workloads for moderation teams.
  • It makes games and the game community more fun, positive, and accessible for everyone. 

The Modulate team is thrilled to bring ToxMod to studios developing for Nintendo Switch, enabling them to reduce toxicity and build positive voice chat experiences. With ToxMod on the Nintendo Developer Portal, studios of all sizes can integrate the industry’s only proactive, real-time anti-toxicity moderation tools into their games easily, efficiently, and effectively. We’re here to make voice chat safe. For everyone.