Tackle Toxicity Faster with ToxMod’s User Report Correlation

Multiplayer gaming has the unique ability to connect people across the globe, all in the name of fun. And for the most part, it’s a wholesome way to connect with others over shared interests. More and more people are realizing this, with the percentage of players who play with others rising from 65% to 83% in just two years.

But that connection has unintended consequences. A recent ADL report found that 77% of adult players had experienced some form of severe harassment. With a percentage that high, studios and their moderators struggle to keep up with reports, resulting in a toxic atmosphere that’s detrimental to both the players and the studio.

Modulate’s mission is to make online interactions safer and more inclusive. Using our expertise in artificial intelligence and machine learning, we developed ToxMod, the only voice-native proactive moderation solution for games and online platforms. Now, with ToxMod’s new User Report Correlation feature, moderators can combine the power of ToxMod’s toxicity detection and scoring features with in-game player reporting tools to gain full context, identify false reports, and fill in missing or erroneous information in real time.

Reactive vs. proactive moderation

For a long time, moderators relied on reactive moderation, and some studios still think this is the only option. Reactive moderation leaves it up to the players to report toxic behavior and relies on them to provide accurate data and truthful reports. But reports can be one-sided, and screenshots and audio clips sent by players can easily be manipulated, stripped of context, or used to retaliate against other players.

With proactive voice moderation, there’s no need to wait for players to report toxic behavior. AI tools such as ToxMod can listen the way a player would, ingesting voice chats and producing a prioritized list of reports for moderators to review. Unlike any other tool available today, ToxMod is voice-native, meaning it goes well beyond mere text transcription to incorporate the nuance of tone, emotion, player reactions, and other important context clues.
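To make that concrete, here’s a rough sketch of the idea in Python. It’s purely illustrative (the field names, categories, and scoring are ours for the example, not ToxMod’s actual data model), but it shows how a voice-native detection can carry more than a transcript, and how detections might be sorted into a prioritized queue:

```python
from dataclasses import dataclass, field

# Hypothetical shape of a voice-native detection. Real systems (including
# ToxMod) will differ; the point is that the record carries more than text.
@dataclass
class VoiceDetection:
    session_id: str
    speaker_id: str
    timestamp: float                       # seconds from the start of the session
    transcript: str                        # what was said
    category: str                          # e.g. "harassment", "hate_speech"
    severity: float                        # 0.0 (mild) .. 1.0 (severe)
    tone_signals: dict = field(default_factory=dict)   # e.g. {"anger": 0.8}
    listener_reaction: str | None = None   # e.g. "distress", "laughter"

def prioritized_queue(detections: list[VoiceDetection]) -> list[VoiceDetection]:
    """Order detections so moderators see the most severe incidents first."""
    return sorted(detections, key=lambda d: d.severity, reverse=True)
```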

Even with this next-level visibility, however, moderators couldn’t easily see the whole picture without manually sifting through logs or switching between tools. As a result, the actions moderators took sometimes didn’t align with reality. Here’s a hypothetical example:

  • Player A is harassing Player B 
  • Player B responds with a toxic comment but doesn’t report Player A
  • Player A reports Player B for their response
  • The moderators ban Player B because of the report
  • Neither player is informed of the moderator action nor the reasoning

In this scenario, Player A was the instigator of the whole ordeal, but the moderators didn’t have that context, and only Player B was banned as a result. And, in many cases, the banned player doesn’t even know why. That’s pretty unfair if you ask us! So what’s the solution?

User report correlation: Take proactive moderation one step further

With this new feature, moderators can get more than just a one-dimensional view of the situation; they can get a three-dimensional narrative. In other words, full context. Instead of trusting user reports and single-event notifications alone, moderators can now automatically collect additional information, such as what was (or wasn’t) said before and after the behavior occurred, as well as details about other players involved in that game session. With this info, moderators are better equipped to do several things:

Identify false reports

It’s inevitable that some players will abuse the reporting system — whether that’s reporting a player they harassed first or getting several people to report someone they just don’t like. It’s unfortunate, but it’s a reality moderators have to deal with.

Our user report correlation feature gives moderators visibility into what did happen, and also what didn’t. If nothing happened, or the report was clearly filed in retaliation against another player, moderators have the evidence needed to close the report quickly so they can spend more time on legitimate reports.
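As a purely illustrative sketch (the matching rules, thresholds, and field names here are hypothetical, not how ToxMod actually works), a correlation check along these lines could flag a report as likely retaliation when the reporter was themselves flagged for severe toxicity in the same session shortly before filing:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    session_id: str
    speaker_id: str
    timestamp: float   # seconds from the start of the session
    severity: float    # 0.0 (mild) .. 1.0 (severe)

@dataclass
class PlayerReport:
    session_id: str
    reporter_id: str
    reported_id: str
    timestamp: float   # when the report was filed

def looks_like_retaliation(report: PlayerReport,
                           detections: list[Detection],
                           window: float = 120.0) -> bool:
    """Flag a report when the reporter was themselves flagged for severe
    toxicity in the same session shortly before filing it."""
    prior = [d for d in detections
             if d.session_id == report.session_id
             and d.speaker_id == report.reporter_id
             and report.timestamp - window <= d.timestamp <= report.timestamp]
    # Made-up threshold: "severe" means a severity score of 0.7 or higher.
    return any(d.severity >= 0.7 for d in prior)
```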

Prioritize incidents

A player calling someone stupid isn’t quite as bad as sexual harassment or hate speech, right? Some toxic behavior will always be more urgent than the rest. And without the ability to automatically tag reports based on urgency, reports of severe harassment can get lost in a slew of minor player reports.

ToxMod’s filters already give moderators the ability to prioritize the most important toxic behavior detected in-game. Now, with user report correlation, they can also make sure context is taken into account when prioritizing reports filed by players. If Player A reports Player B for calling them stupid, that might normally be marked as low priority. But with this additional visibility, the incident could be prioritized higher if the surrounding context shows that Player A exhibited more severe toxic behavior before filing the report.
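Here’s a hypothetical version of that logic, again just a sketch with made-up thresholds and labels rather than ToxMod’s real prioritization rules:

```python
def incident_priority(reported_severity: float,
                      reporter_prior_severity: float | None) -> str:
    """Hypothetical rule: combine the severity of the reported behavior with
    correlated context about the reporter's own earlier behavior."""
    # Baseline: a mild insult would normally land as low priority.
    priority = "low" if reported_severity < 0.4 else "high"
    # Escalate when correlation shows the reporter was the worse offender
    # earlier in the session, since the report may only be half the story.
    if reporter_prior_severity is not None and reporter_prior_severity >= 0.7:
        priority = "high"
    return priority

# Player A reports Player B for a mild insult, but Player A was flagged for
# severe harassment earlier in the same session:
print(incident_priority(reported_severity=0.2, reporter_prior_severity=0.9))  # "high"
```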

Fill in missing or incorrect information

A simple typo, a rushed report sent while the harassment was still happening, an incomplete or vague description: whether intentional or not, these issues can make it difficult to figure out exactly who and what is being reported, and why. That can leave reports unaddressed for lack of vital information, or significantly slow down the moderation process. User report correlation helps moderators narrow down when an event occurred and correct or fill in the missing information.
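For illustration only, a gap-filling step might look something like the sketch below, with every field name and the time window invented for the example rather than taken from ToxMod itself:

```python
def fill_in_report(report: dict, detections: list[dict],
                   window: float = 300.0) -> dict:
    """Hypothetical gap-filling: use detections from the same game session to
    pin down when an incident happened and who was likely involved, when the
    report itself is vague or incomplete."""
    candidates = [
        d for d in detections
        if d["session_id"] == report["session_id"]
        and abs(d["timestamp"] - report.get("timestamp", d["timestamp"])) <= window
    ]
    if not candidates:
        return report  # nothing to correlate with; leave the report as-is
    worst = max(candidates, key=lambda d: d["severity"])
    enriched = dict(report)
    enriched.setdefault("reported_id", worst["speaker_id"])  # fill in a missing name
    enriched["incident_timestamp"] = worst["timestamp"]      # narrow down when it happened
    enriched["detected_category"] = worst["category"]
    return enriched
```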

The future of user report correlation

At Modulate, we’re constantly looking for new ways to make gaming safer. We’ve just launched user report correlation, but we’re already planning on evolving the feature to be more useful and effective for our users. Here’s what’s next:

  • Enhanced notifications for actions taken against toxic players. Many game studios have processes in place to alert players who have filed a report when an action has been taken as a result of that report. This will become even more important as upcoming regulations require this as a structured practice. However, we’ve found that there are a number of instances where a player has filed a report about toxic behavior, and ToxMod has already taken action to address that behavior. We’re building the bridge between these two events, so that a player who has filed a report will receive a notification that the content of their report has already been addressed by ToxMod and escalated to moderators.
  • Even better cross-informational intelligence. We’re making updates to the user interface and user experience to better surface player reports that correspond with toxic behavior ToxMod has detected, so that moderators have a better view into each incident and can avoid unnecessary context switching. We’ll be taking input from customers to ensure we’re using this additional context to improve the moderator experience.
  • Expanded incident context. We want to make it easier to identify what other toxicity may have occurred around a specific incident. For example, if one player is engaging in toxic behavior in reaction to another player doing the same, moderators will have an easier time identifying that these two players were part of the same game session and can adjust their moderation actions accordingly.
  • Optimized user report correlation to support a wider range of reporting systems. Right now, the feature can be integrated with the most popular reporting systems in use today, but we’re working to better enable correlation with every type and format of player report system out there. 

Take appropriate action with zero interaction

This increased visibility gives moderators what they need to mitigate toxic behavior without having to interact with the players or request additional information from them, making the whole process more efficient for moderators and players alike. User report correlation also helps moderators take appropriate action by making it more difficult for players to abuse or manipulate the reporting system, and it gives them all the context they need to ban, suspend, or warn those at fault.

All these features help gaming studios everywhere work toward a common goal: taking action appropriately and efficiently to make their game a safer place.