Caltech x Activision Research: New Insights Into Toxicity in Games

September 22, 2025
Mark F.
(HE/HIM/HIS)

Gaming studios have long known that toxicity can hurt the player experience, and it can hurt your bottom line, too. Negative behavior drives players away, reduces engagement, and can quickly damage a studio’s reputation. That’s why the partnership between Caltech researchers, Activision, and Modulate’s ToxMod is so exciting: for the first time, large-scale academic studies are quantifying how toxicity spreads, how it impacts engagement, and what studios can do about it.

Two recent papers powered by ToxMod’s data shed light on both the causes and the costs of toxicity. Here’s what they found — and how studios can act on it.

Why ToxMod Data Matters

Both papers were powered by voice chat data from Call of Duty®: Modern Warfare® III, analyzed with ToxMod, Modulate’s voice moderation technology.

ToxMod doesn’t just transcribe speech — it detects emotion, volume, intent, and context to flag harmful behavior in real time. That depth enabled researchers to move beyond surface-level metrics and measure both the causal impact of toxicity on engagement and the effectiveness of smarter detection strategies.

Study 1: Smarter Detection With Bandit Algorithms

The first paper asked a practical question: how can studios detect toxicity efficiently, given that full-time monitoring is expensive?

Researchers tested a contextual bandit algorithm (LinUCB) — a machine learning approach originally used in recommendation systems — to decide which players to monitor. Unlike traditional systems that rely only on past player behavior, this algorithm factored in contextual signals like party size, skill level, and prior moderation history.
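
For a concrete sense of how this works, here is a minimal LinUCB sketch in Python for allocating a limited monitoring budget. It is an illustration only: the shared-model formulation and the feature names (party size, skill gap, prior reports) are assumptions made for the example, not the exact setup used in the study.

    # Minimal LinUCB sketch: score players by an upper confidence bound on the
    # expected value of monitoring them, then monitor the top-k under a budget.
    # Feature names and the reward definition are hypothetical, for illustration only.
    import numpy as np

    class LinUCB:
        def __init__(self, n_features, alpha=1.0):
            self.alpha = alpha                 # exploration strength
            self.A = np.eye(n_features)        # ridge-regularized design matrix
            self.b = np.zeros(n_features)      # reward-weighted feature sums

        def score(self, x):
            """Upper confidence bound on the expected reward for context x."""
            A_inv = np.linalg.inv(self.A)
            theta = A_inv @ self.b             # current estimate of feature weights
            return theta @ x + self.alpha * np.sqrt(x @ A_inv @ x)

        def update(self, x, reward):
            """Record whether monitoring this player surfaced toxicity (0 or 1)."""
            self.A += np.outer(x, x)
            self.b += reward * x

    def select_players(model, contexts, budget):
        """Pick the top-`budget` players to monitor from per-player context vectors."""
        scores = {pid: model.score(x) for pid, x in contexts.items()}
        return sorted(scores, key=scores.get, reverse=True)[:budget]

    # Example round: contexts map player IDs to [party_size, skill_gap, prior_reports].
    model = LinUCB(n_features=3)
    contexts = {"p1": np.array([4.0, 0.2, 1.0]), "p2": np.array([1.0, 1.5, 0.0])}
    for pid in select_players(model, contexts, budget=1):
        reward = 1.0  # placeholder: 1.0 if moderation found toxic speech, else 0.0
        model.update(contexts[pid], reward)

The exploration term is what helps in “cold start” situations: players or contexts the model has rarely seen get a wider confidence bound, so they are sampled before the model settles into exploiting what it already knows.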

Key takeaways

  • Contextual bandit approach outperforms baselines: The bandit algorithm outperformed rule-based algorithms by up to 24.6 percentage points in detection accuracy (~51.5% relative improvement), depending on monitoring coverage.

  • Contextual features pinpoint who to monitor: Features like skill mismatch, prior reports, and party status were statistically significant predictors of toxicity, and more reliable than past behavior alone.

  • Strategic monitoring > universal monitoring: Returns diminish when studios try to monitor everyone. The biggest gains came at moderate coverage levels, from monitoring strategically rather than universally.

  • Real-time adaptation works: Even in “cold start” situations (like new game modes or fresh launches), the algorithm learned quickly by exploring first, then refining. Daily updates were sufficient; more frequent retraining added cost without clear benefit.

Why it matters

Gaming studios can get the biggest bang for their buck when they allocate moderation resources more intelligently, reducing costs while catching more harmful behavior. Contextual algorithms help balance immediate detection with long-term learning, making them especially useful in the early stages of a game’s launch or in a new gameplay mode.

Study 2: How Toxicity Affects Players

The second paper turned to outcomes, asking: What actually happens when players are exposed to toxicity?

Using causal methods, researchers studied how toxicity influences short-term engagement and the spread of negative behavior.
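
The study’s exact estimator is not reproduced here, but the sketch below shows one way an exposure-by-outcome interaction can be estimated with a simple regression adjustment. The data is synthetic and the effect sizes are arbitrary; only the structure of the analysis is meant to be illustrative.

    # Illustrative regression on synthetic data: how exposure to toxic voice chat
    # might relate to time-until-next-match, with a loss interaction. This is NOT
    # the published study's estimator or data; all numbers here are made up.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5_000

    exposed = rng.integers(0, 2, n)       # 1 if exposed to toxic voice chat
    lost_match = rng.integers(0, 2, n)    # 1 if the player lost the match
    skill = rng.normal(size=n)            # standardized skill rating (covariate)

    # Synthetic outcome: minutes until the player queues for the next match.
    minutes_to_next = (10 + 3 * exposed + 2 * lost_match
                       + 1.5 * exposed * lost_match - 0.5 * skill
                       + rng.normal(scale=4, size=n))

    # Fit: intercept, exposure, loss, exposure x loss interaction, skill.
    X = np.column_stack([np.ones(n), exposed, lost_match, exposed * lost_match, skill])
    coef, *_ = np.linalg.lstsq(X, minutes_to_next, rcond=None)

    print(f"exposure effect after a win:  {coef[1]:+.2f} minutes")
    print(f"exposure effect after a loss: {coef[1] + coef[3]:+.2f} minutes")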

Key takeaways

  • Toxicity reduces engagement: Players exposed to toxicity took significantly longer to join their next match. 
  • Toxicity is contagious: Players exposed to toxic voice chat were more likely to mirror that behavior and use toxic language themselves. 
  • Effects differ based on the source of toxicity: Opponent toxicity had the biggest impact on engagement, especially after losses. Same-party teammate toxicity was the strongest driver of contagion, but had less impact on rejoining delays.
  • Match outcome matters: Losing amplified both harms; players delayed rejoining longer and were more likely to spread toxicity. Winning softened these effects, particularly against opponents.

Why it matters

The impact of toxicity isn’t uniform. If your goal is to keep players engaged, you’ll want to prioritize addressing opponent toxicity after losses. If your goal is to stop toxic behavior from spreading, focus on monitoring within parties.
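
One hypothetical way to put that prioritization into practice is a simple triage rule for voice-chat flags; the labels and weights below are assumptions for illustration, not values taken from the research.

    # Hypothetical triage rule: rank flags by the relationship between speaker and
    # listener and by match outcome, so the highest-impact incidents surface first.
    def triage_priority(source: str, listener_lost: bool) -> float:
        """source is one of 'opponent', 'party', 'other' (illustrative labels)."""
        if source == "opponent" and listener_lost:
            return 3.0   # biggest engagement risk: opponent toxicity after a loss
        if source == "party":
            return 2.0   # biggest contagion risk: toxicity inside the player's own party
        return 1.0

    flags = [
        {"id": 1, "source": "party", "listener_lost": False},
        {"id": 2, "source": "opponent", "listener_lost": True},
    ]
    flags.sort(key=lambda f: triage_priority(f["source"], f["listener_lost"]), reverse=True)
    print([f["id"] for f in flags])  # -> [2, 1]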

What Studios Should Do Next

Together, these studies highlight two truths:

  1. Toxicity hurts your bottom line. Its detrimental effects in competitive online games are well documented and widely recognized, including reduced engagement and potential harm to psychological well-being. It drives players away and fuels negative loops of behavior.
  2. Smarter tools can break the cycle. Contextual algorithms and real-time voice moderation like ToxMod give studios the precision to address toxicity ethically, efficiently, and at scale.

By adopting adaptive systems that respond differently depending on who is being toxic and when, studios can protect players while boosting engagement and retention.

Learn More About ToxMod

As this research reinforces, the right data and tools unlock new possibilities for understanding — and solving — the problem of toxicity.

ToxMod is already helping leading studios monitor voice chat in real time, detect nuanced forms of harassment, and keep players safe without overburdening moderation teams. Learn more about ToxMod and how it can help your studio scale trust and safety.