ToxMod Levels Up Its AI Voice Chat Moderation To Take On Violent Radicalization in Online Gaming

Publish Date:
June 15, 2023

SOMERVILLE, MA / June 15, 2023 / Modulate, creator of purpose-built AI voice technology that improves the health and safety of online gaming communities, today announced that it has deployed the new Violent Radicalization detection category in its ToxMod voice chat moderation software. This groundbreaking detection category makes ToxMod the gaming industry's only voice moderation solution capable of identifying individuals promoting white supremacist and white nationalist radicalization and extremism in real time, allowing community moderators to take immediate action.

In addition to the Violent Radicalization category, ToxMod continues to offer battle-tested detection of harms including Bullying, Racial & Cultural Hate Speech, Gender & Sexual Hate Speech, and more, helping game studios better moderate and address problematic player behaviors in their games.

The newly introduced Violent Radicalization category aims to address critical concerns within the gaming community by identifying and flagging the following behaviors:

  • Promotion - Sharing violent ideology to influence others.
  • Recruitment - Convincing individuals to join extremist groups or movements.
  • Targeted Grooming - Persuading vulnerable individuals, including children and teenagers, to participate in extremist activities.
  • Planning Violence - Actively plotting physical violence.

"We are committed to creating a safer and more inclusive gaming environment," said Mike Pappas, Chief Executive Officer and Co-founder at Modulate. "With the rise of extremism and radicalization in video games, our new Violent Radicalization detection category equips game studios with the necessary tools to combat the spread of extremist ideologies on their platforms."

The proliferation of white nationalist and white supremacist ideologies in online games is a well-documented problem. ToxMod's new Violent Radicalization detection category is the result of months of extensive research and collaboration with the Anti-Defamation League (ADL).

According to Daniel Kelley, Director of Strategy and Operations at the ADL's Center for Technology and Society, "The spread of white supremacist extremist ideologies through online multiplayer games should be an area of grave concern for the games industry. Our 2022 survey found that one in five adults reported being exposed to white supremacist ideologies in online games. This represents a startling increase from just 8 percent who reported being exposed to white radicalism in 2021."

Drawing on customer feedback and data, Modulate will continue to expand and refine the terminology and phrases associated with Violent Radicalization to ensure the highest level of accuracy and effectiveness.

About ToxMod

ToxMod is gaming's only proactive, voice-native moderation solution. Built on advanced machine learning technology and designed with player safety and privacy in mind, ToxMod triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to respond quickly to each incident by supplying relevant and accurate context. In contrast to reactive reporting systems, which rely on players to make the effort to report bad behavior, ToxMod is the only voice moderation solution in games today that enables studios to respond proactively to toxic behavior and prevent harm from escalating.
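Modulate has not published ToxMod's internal design, but the triage-then-escalate flow described above can be illustrated with a short sketch. The Python below is purely illustrative: the Utterance and Flag types, the keyword-based score_utterance placeholder, and the threshold and context-window values are all assumptions standing in for whatever trained models and escalation logic ToxMod actually uses.

    from dataclasses import dataclass

    # Hypothetical harm categories mirroring those named in this release.
    CATEGORIES = ["Promotion", "Recruitment", "Targeted Grooming", "Planning Violence"]

    @dataclass
    class Utterance:
        speaker: str
        transcript: str

    @dataclass
    class Flag:
        utterance: Utterance
        category: str
        score: float
        context: list  # surrounding turns supplied to the moderator

    def score_utterance(utterance):
        """Placeholder classifier returning (category, score) for one utterance.
        A production system would use trained speech and toxicity models here."""
        keywords = {"join us": ("Recruitment", 0.9), "our cause": ("Promotion", 0.8)}
        for phrase, (category, score) in keywords.items():
            if phrase in utterance.transcript.lower():
                return category, score
        return None, 0.0

    def triage(session, threshold=0.7, context_window=2):
        """Flag utterances whose harm score exceeds the threshold, attaching
        nearby conversation turns so a moderator can judge the full context."""
        flags = []
        for i, utt in enumerate(session):
            category, score = score_utterance(utt)
            if category and score >= threshold:
                lo, hi = max(0, i - context_window), i + context_window + 1
                flags.append(Flag(utt, category, score, session[lo:hi]))
        return flags

    session = [
        Utterance("player_1", "nice shot"),
        Utterance("player_2", "you should join us, we have a server"),
    ]
    for flag in triage(session):
        print(flag.category, flag.score, flag.utterance.transcript)

In a real deployment, the placeholder classifier would be replaced by the kind of machine learning models the release describes; the attached context window reflects the idea that moderators judge intent from the nuances of a conversation rather than from isolated phrases.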

About Modulate

Modulate builds intelligent voice technology that combats online toxicity and elevates the health and safety of online communities. ToxMod, Modulate's proactive voice moderation platform, empowers community teams to make informed decisions and protect players from harassment, toxic behavior, or more insidious harms - without relying on ineffective player reports. ToxMod is the only voice-native moderation solution available today and has processed millions of hours of audio for AAA game studios, indie developers, and console platforms alike. Modulate's advanced machine learning frameworks have helped customers protect tens of millions of players against online toxicity to ensure safe and inclusive spaces for everyone.

Visit Modulate at https://www.modulate.ai to learn more, and follow Modulate on LinkedIn and Twitter.