ToxMod makes voice chat safe

Deployed in: Riot Games, Lucky VR, Virtex Stadium

The only proactive voice chat moderation solution purpose-built for games

Built on advanced machine learning technology and designed with player safety and privacy in mind, ToxMod triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to quickly respond to each incident by supplying relevant and accurate context.

BOOK A DEMO
Give us 30 minutes, and we'll prove ToxMod's ROI.

VOICE-NATIVE

ToxMod was born to understand all the nuance of voice. It goes beyond transcription to consider emotion, speech acts, listener responses, and much more.

INTELLIGENT

ToxMod becomes an expert in your game's code of conduct and escalates the incidents that matter most to your team.

SECURE

All user data is anonymized and protected to ISO 27001 standards. Modulate will not sell or rent your data, ever.

PLUG-AND-PLAY

ToxMod ships with a variety of plugins for different combinations of game engine and voice infrastructure. You can integrate in less than a day.

FLEXIBLE

ToxMod provides the reports. Your Trust & Safety team decides which action to take for each detected harm.

DETAILED

Review an annotated back-and-forth between all participants to understand what drew ToxMod’s attention and who instigated things.

Go beyond player reports and take the burden off your users

Unlike reactive player reports, which rely on players to take action and report bad behavior themselves, ToxMod is the only voice moderation solution in games today that enables studios to respond proactively to toxic behavior as it's happening, preventing harm from escalating.

67%

A majority of multiplayer gamers (67%) say they would likely stop playing a multiplayer game if another player were exhibiting toxic behavior.

83%

Five out of six adult gamers (83%) report facing toxicity online, across every player demographic, with underprivileged players often targeted disproportionately.

What is Proactive Moderation?

How ToxMod works

First, ToxMod triages voice chat data to determine which conversations warrant investigation and analysis.

  • Triaging is a crucial component of ToxMod’s efficiency and accuracy, flagging the most pertinent conversations for toxicity analysis and removing silence or unrelated background noise.
  • Unlike text moderation, processing voice data is labor intensive and often cost prohibitive, necessitating accurate and reliable filtering.

Second, ToxMod analyzes the tone, context, and perceived intention of those filtered conversations using its advanced machine learning processes.

  • ToxMod’s powerful toxicity analysis assesses the tone, timbre, emotion, and context of a conversation to determine the type and severity of toxic behavior.
  • ToxMod is the only voice moderation tool built on advanced machine learning models that go beyond keyword matching to provide true understanding of each instance of toxicity.
  • ToxMod’s machine learning technology can understand emotion and nuance cues to help differentiate between friendly banter and genuine bad behavior.

Third, ToxMod escalates the voice chats deemed most toxic, and empowers moderators to efficiently take actions to mitigate bad behavior and build healthier communities.

  • ToxMod’s web console provides actionable, easy-to-understand information and context for each instance of toxic behavior.
  • Moderators and community teams can work more efficiently, allowing even small teams to manage and monitor millions of simultaneous chats.
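
To make this triage, analyze, and escalate flow concrete, below is a minimal illustrative sketch in Python. It is not ToxMod's actual API or model: every class, function, and threshold here is hypothetical, and the simple keyword-and-loudness check only stands in for the machine-learning tone and context analysis described above.

    from dataclasses import dataclass
    from enum import Enum


    class Severity(Enum):
        LOW = 1
        MEDIUM = 2
        HIGH = 3


    @dataclass
    class VoiceClip:
        session_id: str
        speaker_id: str
        transcript: str   # stand-in for richer audio features (tone, emotion, etc.)
        loudness: float   # 0.0-1.0, a crude proxy for shouting


    def triage(clips):
        """Step 1: keep only clips worth analyzing; drop silence and empty audio."""
        return [c for c in clips if c.transcript.strip()]


    def analyze(clip):
        """Step 2: score a clip. A real system weighs tone, emotion, and context,
        not keywords; this lookup is only a placeholder."""
        hostile_terms = {"idiot", "trash"}  # hypothetical placeholder list
        hits = sum(term in clip.transcript.lower() for term in hostile_terms)
        if hits and clip.loudness > 0.8:
            return Severity.HIGH
        if hits:
            return Severity.MEDIUM
        return Severity.LOW


    def escalate(clips):
        """Step 3: surface only the most severe incidents to human moderators."""
        scored = [(c, analyze(c)) for c in triage(clips)]
        return [(c, s) for c, s in scored if s is not Severity.LOW]


    if __name__ == "__main__":
        chat = [
            VoiceClip("match-42", "player-1", "nice shot!", 0.4),
            VoiceClip("match-42", "player-2", "you absolute idiot", 0.9),
            VoiceClip("match-42", "player-3", "", 0.1),
        ]
        for clip, severity in escalate(chat):
            print(f"[{severity.name}] {clip.speaker_id}: {clip.transcript}")

In a real deployment, the escalation step feeds ToxMod's web console, where moderators review each incident with the context described above.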

Ready for enterprise

ToxMod provides best-in-class community health and protection services for AAA studios and indies alike

Enterprise-Grade Support

Modulate’s support team goes above and beyond for our customers. Modulate’s technical teams are available 24/7 to help address any critical issues in real time.

Available in 18 Languages & Counting

ToxMod can distinguish real harms in multiple languages, and even keep track of context in mixed-language conversations.

Built-In Compliance Readiness

Modulate's team supports customers in writing clear Codes of Conduct, producing regular Transparency Reports, and conducting regular Risk Assessments.

Book a Demo

Learn how ToxMod can help you protect your players and empower your community and moderation teams.


Sign up for our Trust & Safety Lately newsletter

A regular newsletter exploring the landscape of Trust & Safety in games, and explaining what the latest news and trends mean for studios.