Modulate at GDC 2024: Leveraging AI to Foster Safer Gaming Communities

We were honored to attend the 2024 Game Developers Conference (GDC) in San Francisco, one of the premier annual events in the gaming industry, bringing together professionals from development, publishing, marketing, and technology. This year’s conference featured hundreds of sessions, panel discussions, and networking events.

Modulate hosted several events throughout the week, including:

  • Gaming Safety Coalition Networking Night, where we welcomed player safety experts who are committed to forging a safer gaming universe.
  • i3D.net, RallyHere, and Modulate Mixer, where we joined forces with i3D.net and RallyHere to offer an unparalleled chance to mingle with the gaming industry's elite in a laid-back setting.
  • Modulate de-Tox Brunch, where we created a tranquil, alcohol-free zone for safety, community, and moderator professionals to unwind and connect after the whirlwind first few days of GDC. 
  • AWS Partner Spotlight, where Modulate CTO & Co-founder Carter Huffman contributed to a panel on Practical AI for Game Development with AWS thought leaders, game studios, and technology leaders. 

It was exciting to connect with the brightest minds in gaming, and we learned so much! Let’s distill some of the most important themes and takeaways from GDC 2024, from leveraging AI to foster safer gaming communities to balancing player safety and privacy.

Panel Discussion: Enabling Player Safety Without Compromising Privacy

Modulate CEO & Co-Founder Mike Pappas, along with our COO, Terry Chen, contributed to a panel discussion about striking the balance between player safety and privacy. They were joined by Paul Nunn (GM of Kids Web Services at Epic Games), Lewis Ward (Research Director of Gaming and AR/VR at IDC), Tess Lynch (Privacy & IP Associate at Premack Rogers P.C.), and Joe Newman (Senior Legal Counsel for K-id).

This in-depth panel discussion covered a lot of ground, from the ethical considerations surrounding data collection for content moderation to the specific challenges posed by minors and Personally Identifiable Information (PII). A few key takeaways included:

  • Data Collection Minimization: Determine the minimum amount of data you need to collect in order to achieve a specific goal.
  • Security and Threat Detection: Focus on collecting sufficient data to comprehensively understand and address potential threats.
  • Content Moderation Laws: Consider the balance between additional data requests (e.g., age verification and proof) and privacy risks, especially concerning children's protection and regulatory effectiveness.
  • Special Protections for Kids: Most agree that kids deserve special protections, but it’s important to define what those protections are and how to implement them, including age-based regulations like those required by the Children’s Online Privacy Protection Act (COPPA).
  • Global Reach: Games are global, so it’s essential to tailor strategies to regional differences in regulations and user demographics.
  • Technical Challenges: Recognize and plan for the technical challenges of managing a large player base, including concurrent user spikes, and invest in robust systems to support scalability and performance.

Panel Discussion: The Impact of A.I. on Community Health

Modulate CTO & Co-Founder Carter Huffman contributed to a panel discussion on how AI can be leveraged for better community moderation and safer gaming. Joined by Kim Kunes (VP of Gaming Trust & Safety at Microsoft), Dean Takahashi (Lead Writer at GamesBeat/VentureBeat), Justin Sousa (Head of Developer Community at Roblox), and James Gallagher (Head of Community Management at Keywords Studios), the panel explored:

  • Human Oversight: While AI can act as a first responder in moderation, final decisions should involve human review and the possibility of appeal. AI should be used for triage, not as the ultimate authority.
  • Transparency: Organizations should provide clear and transparent messaging about how they use AI in moderation. Instead of just stating "we use AI," they should explain what the AI tool does, how it is used, and how data is being processed or stored and for how long.
  • Automation: AI can automate moderation actions for clear violations, allowing human moderators to focus on higher-value tasks that require human judgment and intervention (see the sketch just below).
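
To make the "AI as triage, humans as final authority" pattern concrete, here is a minimal, purely illustrative sketch of what such a routing step could look like. The thresholds, names, and severity labels below are our own assumptions for the example, not ToxMod's actual logic or anything a panelist described.

```python
# Illustrative only: route AI-flagged voice clips so that automation handles
# clear-cut cases and humans keep the final say (plus the option to appeal).
from dataclasses import dataclass
from enum import Enum


class Verdict(Enum):
    NO_ACTION = "no_action"
    AUTO_ACTION = "auto_action"      # clear-cut violation, light automated response
    HUMAN_REVIEW = "human_review"    # ambiguous or severe, escalate to a moderator


@dataclass
class TriageResult:
    verdict: Verdict
    confidence: float
    reason: str
    appealable: bool = True          # any actioned player can appeal to a human


def triage(clip_confidence: float, severity: str) -> TriageResult:
    """Automate only the unambiguous, lower-severity cases; escalate the rest."""
    if severity == "severe":
        # The most serious harms always get human eyes before any penalty.
        return TriageResult(Verdict.HUMAN_REVIEW, clip_confidence, "severe: human judgment required")
    if clip_confidence >= 0.95:
        # High-confidence, lower-severity violation: a light automated action such as a warning.
        return TriageResult(Verdict.AUTO_ACTION, clip_confidence, "clear violation: auto-warn")
    if clip_confidence >= 0.50:
        # Ambiguous cases are queued for a moderator rather than auto-actioned.
        return TriageResult(Verdict.HUMAN_REVIEW, clip_confidence, "ambiguous: escalate")
    return TriageResult(Verdict.NO_ACTION, clip_confidence, "below action threshold")
```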

GDC 2024 Core Track Sessions

We also sat in on a few key Core Track talks throughout GDC this year, including one led by Modulate experts on insights that can be gleaned from analyzing voice moderation data.

Voice Chat Unlocked: Community Insights from a Novel Source

Our very own Director of Account Management, Mark F, and Senior Data Analyst, Liz W, walked us through how Call of Duty integrated ToxMod to improve their Player Reports. The analysis found that 46,000 Player Reports were submitted each day, and 25% of those didn’t correlate to an actionable offense. Meanwhile, 79% of the severe offenses detected by ToxMod were never reported by players at all. Here are a few other key insights from Modulate’s research within Call of Duty and other customer games (we run the quick math on these report figures right after the list):

  • Adults and Children Behave Differently: Understanding how different audiences play and interact can inform your Code of Conduct, moderation guidelines, and even game design.
  • Player Reports Don’t Cut It: Relying only on player reports misses a significant number of Code of Conduct violations that could be reviewed and mitigated using proactive moderation.
  • Warnings and Mutes are a Great Start: Lighter-touch moderations (applying mutes instead of bans) increase understanding of and adherence to the Code of Conduct and allow players to keep enjoying the game.
  • Toxicity Directly Affects Engagement: Players who are exposed to toxicity are less likely to keep playing the game. When that exposure is reduced at scale with proactive moderation, players will keep coming back.
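
To put those report volumes in perspective, here is a quick back-of-the-envelope breakdown using only the figures cited above (and our simplifying assumption that they hold as steady daily averages):

```python
# Back-of-the-envelope math on the Call of Duty figures cited above.
daily_reports = 46_000
non_actionable_share = 0.25     # player reports with no actionable offense behind them
unreported_severe_share = 0.79  # severe offenses ToxMod caught that no player reported

actionable = daily_reports * (1 - non_actionable_share)
non_actionable = daily_reports * non_actionable_share
print(f"Actionable reports per day:     ~{actionable:,.0f}")      # ~34,500
print(f"Non-actionable reports per day: ~{non_actionable:,.0f}")  # ~11,500

# For every severe offense players did report, roughly 0.79 / 0.21 ≈ 3.8 more
# were surfaced only by proactive voice moderation.
missed_per_reported = unreported_severe_share / (1 - unreported_severe_share)
print(f"Severe offenses missed per one reported: ~{missed_per_reported:.1f}")
```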

The ROI on T&S: The cost of toxic gamer cultures for players, studios and the bottom line

Rachel Kowert, Research Director at Take This, delivered an insightful presentation on the ROI of Trust & Safety (T&S) in gaming environments. Kowert opened with several widely held assumptions about toxicity in gaming and dismantled each one with research from Take This and other gaming studies. The first assumption: that the internet doesn’t “really matter.” The second: that toxic players make up a significant portion of consumers, meaning moderation could cut out your core base of players. The third: that toxicity only impacts a small number of gamers and isn’t worth investing in.

However, research shows a very different story. Here are a few stats Kowert shared during her talk:

  • In a Take This research study, 90% of people surveyed report a negative impact on mental health due to toxicity in digital spaces.
  • 10% of those surveyed experienced suicidal thoughts due to experiences in games.
  • Only 5% of gamers are bad actors, hardly a studio’s core demographic.
  • 75% of gamers say it’s important to feel safe in gaming environments. 
  • 70% said they avoid certain games due to bad reputations. 
  • 60% reported not spending money because of how they were treated by others in a multiplayer game.

So what does this mean for gaming companies? Kowert suggested:

  • Investing in community management.
  • Establishing strong guidelines and codes of conduct.
  • Adopting industry-wide standards.
  • Leveraging available tools like Fair Play Alliance's Digital Thriving Playbook.
  • Combining codes of conduct with effective moderation to reduce hate speech and improve player experiences.

AI-Assisted Player Support in Among Us VR Community

Laura Norwicke Hall, Senior Player Support Specialist at Schell Games, presented a fascinating session on how she and the Schell team built a player support initiative from the ground up, from sourcing tools like ToxMod and developing SusAdmin, to establishing ecosystem-wide philosophies and standards for player protection.

While bringing Among Us to the VR space in 2022 posed several challenges, Schell developed a holistic strategy to keep players safe through in-game features, AI-based tools, and human moderation. Here are a few of the key lessons: 

  • In-Game Features: Schell wanted more than just a standard email support ticketing system. Instead, they integrated in-game features like report buttons, player muting, vote kicking, and age gating for a more proactive approach to moderation. In other words, build player safety tools directly into the game and give players the power to better control their own play experience.
  • AI-Based Tools: In April 2023, they launched moderation using SusAdmin, a custom-built Player Management System that allows moderators to take action against toxicity and review appeals based on reports from ToxMod. ToxMod reviews about 50,000 hours of in-game audio per month (which, without a tool like ToxMod, would require 300-350 full-time moderators to review) and then feeds claims to SusAdmin (see the quick reviewer math after this list).
  • Human Moderation: Schell invested in building a comprehensive knowledge base and training moderators with keyword recognition tools and player-facing FAQs to handle the volume of in-game content effectively.
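
The 300-350 full-time moderator figure above is easy to sanity-check. Here is a minimal sketch, assuming (our assumption, not Schell's) that a full-time moderator reviewing audio in real time covers roughly 140-170 hours of recordings per month once breaks and other duties are factored in:

```python
# Rough sanity check of the 300-350 full-time moderator estimate cited above.
audio_hours_per_month = 50_000  # in-game audio ToxMod reviews monthly

# Assumed real-time review capacity per full-time moderator, in hours per month.
for capacity in (140, 160, 170):
    moderators_needed = audio_hours_per_month / capacity
    print(f"{capacity} review-hours per moderator -> ~{moderators_needed:.0f} moderators")
# 140 -> ~357, 160 -> ~312, 170 -> ~294: consistent with the cited 300-350 estimate.
```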

While the ROI of moderation isn’t always immediate, Hall suggested implementing moderation early, doing ongoing research, and accepting that you won’t stop all toxicity, and that’s okay. 

GDC offered invaluable insights into fostering safer gaming communities, striking a balance between safety and privacy, and leveraging AI to benefit gamers and your bottom line. We were honored to be a part of this community and look forward to next year!