Modulate’s FPA Pledge

Toxicity and disruptive behavior in gaming isn't a problem any single organization can solve on its own. That's why Modulate is proud to be a member of the Fair Play Alliance (FPA), a group of gaming-related companies committed to improving the experience of online gaming for all. As the culmination of deep discussion and genuine research, the FPA recently published a framework for how to approach these challenges, and issued a concurrent call to action - asking all interested companies to create a list of actionable goals for the next six months covering what we can do to improve our communities.

Modulate is in a bit of an unusual position for such a pledge - as a software vendor, more often than not we’re partnering with community leaders on their specific games, rather than deploying our technology independently into any given community. On the flip side, though, we are positioned extremely well to have a broad impact, since any solutions we devise can flow into a wide range of games quite quickly through our network of partnerships. So we’re excited to outline a few of the challenges described in the FPA’s framework, and some of the work we’ll be committing to in the coming months to tackle them.

Gaining the benefits of voice chat without the risks

The FPA's framework states that, today, enabling voice chat among strangers by default is not recommended. This is a natural response to the risks of harassment, hate speech, and other forms of toxicity in voice, and a position echoed by many individuals from across the industry. Unfortunately, it's not a viable long-term solution, because socialization is becoming too central to gaming to simply leave out.

Chris Priebe at Two Hat Security noted a few years ago that users participating in some form of social chat are more than 300% more likely to continue playing. Another study emphasizes that voice chat is uniquely empowering for many communities - nearly 80% of female players felt at least partially empowered by the opportunity to interact with others in games differently from how they would in the real world, even as 75% of those same players reported experiencing active harassment while playing. Additional research has found that voice chat creates closer bonds than text chat, builds stronger and more accurate feelings of empathy, and overall simply enhances the social experience online.

Most importantly, though, many players will find a way to use voice chat regardless. Those who avoid voice will be at an unfair disadvantage, and if players are pushed onto unmoderated third-party tools, the environment is likely to become even more dangerous and toxic.

Solving this problem is a core part of Modulate’s mission, and we’re already hard at work building the tools to empower more players to speak up and to detect and stop bad behavior. Over the coming months, we’ll be working with a variety of studios to deploy these tools into real communities and evaluate how player experiences improve.

Reshaping social dynamics

The FPA framework calls out a challenging question - where is the line, when it comes to defining harmful or toxic behavior? The reality is that there is no well-defined answer to discover here, because cultures are always changing. Modulate, like many other companies, deals with this challenge by keeping humans in the loop - in our case, we ask real community managers to weigh in on grey cases, which automatically provides feedback to train our AI systems. But we think the real solution will be to circumvent this question as much as possible. What if, instead of primarily reacting to behavior by judging it as good or bad, we could organize our games so that fewer players experience content they would deem harmful in the first place?
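
To make the human-in-the-loop idea concrete, here is a minimal, purely hypothetical sketch of how grey cases might be routed to community managers and their decisions captured for later model training. The thresholds, class, and function names below are illustrative assumptions, not a description of Modulate's actual pipeline.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Illustrative (assumed) thresholds: scores below LOW are treated as clearly fine,
# scores above HIGH as clearly violating, and everything in between as a "grey case".
LOW, HIGH = 0.2, 0.8

@dataclass
class ReviewQueue:
    """Collects grey cases for human review and stores the labeled outcomes."""
    pending: List[str] = field(default_factory=list)
    labeled: List[Tuple[str, bool]] = field(default_factory=list)  # (clip_id, is_harmful)

    def triage(self, clip_id: str, toxicity_score: float) -> str:
        if toxicity_score < LOW:
            return "allow"                # confident: no action needed
        if toxicity_score > HIGH:
            return "flag"                 # confident: escalate automatically
        self.pending.append(clip_id)      # uncertain: ask a human
        return "needs_review"

    def record_decision(self, clip_id: str, is_harmful: bool) -> None:
        # A community manager's verdict becomes a training example
        # that can feed back into the next model update.
        self.pending.remove(clip_id)
        self.labeled.append((clip_id, is_harmful))

queue = ReviewQueue()
print(queue.triage("clip-001", 0.55))    # -> "needs_review"
queue.record_decision("clip-001", True)  # human verdict saved for retraining
```

The value of this pattern is that the model only defers where it is genuinely uncertain, so moderator time goes to the cases where human judgment matters most.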

This might sound like a fanciful dream, but we believe it's possible, and are hard at work building the tools to make it happen. The key insight, as the framework notes, is that most harmful behavior isn't committed by someone who is fundamentally evil. They might be misinformed, or led astray by a bad mentor, or frustrated and coping poorly, or simply making a mistake, but the vast majority of online offenders would prefer to be getting along with folks. The issue is that a combination of cultural elements, emotions, and social pressures pushes them off course, and the prevailing approach of punishing these bad actors more often reinforces the bad behavior than corrects it.

Modulate’s ToxMod system provides the tools necessary to understand this sort of contextual nuance. Rather than trying to make predictions about people based only on their text chat, ToxMod gains insight into the way each player’s emotions evolve over the course of the game, providing a much more fine-tuned and actionable understanding of what sorts of experiences tend to trigger players to become disruptive.
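
As a purely illustrative sketch (not a description of how ToxMod actually works), tracking emotional context over a match might look like keeping a per-player time series of frustration scores and surfacing the in-game events at which those scores spike; every name and threshold below is an assumption made for the example.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# Per-player timeline of (timestamp, event, frustration score in [0, 1]).
timelines: Dict[str, List[Tuple[float, str, float]]] = defaultdict(list)

def log_moment(player: str, t: float, event: str, frustration: float) -> None:
    timelines[player].append((t, event, frustration))

def spike_triggers(player: str, jump: float = 0.4) -> List[str]:
    """Return the events at which a player's frustration rose sharply."""
    points = timelines[player]
    return [
        event
        for (_, _, prev_score), (_, event, score) in zip(points, points[1:])
        if score - prev_score >= jump
    ]

log_moment("player_a", 10.0, "match_start", 0.1)
log_moment("player_a", 95.0, "repeated_team_kill", 0.7)  # frustration jumps here
print(spike_triggers("player_a"))  # -> ["repeated_team_kill"]
```

Even a simple aggregation like this hints at which experiences tend to precede disruptive behavior, which is the kind of contextual signal a text-only view misses.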

Of course, we need to be immensely careful here, as we are all too aware of the ways AI systems tend to build biases when asked to make these kinds of predictions - especially given that a player’s voice conveys information about their demographics too. Given this risk, we’re approaching this work extremely slowly, and likely will never trust a black-box AI system to actually reshape player interactions entirely on its own. But over the next few months, we’re aiming to at least begin surfacing initial insights to individual community teams, who will then be able to use them as jumping-off points for further, human-guided exploration of their community dynamics. We hope that even this spark alone may lead to some powerful new ideas about how to create more inclusive spaces by design.

Defining what “good” looks like

As a third-party vendor, Modulate is uniquely positioned to analyze information from a variety of different communities and contribute to an overall assessment of industry health. Of course, we will never share private data from our partners without their explicit consent. But by applying our technology to purely public data - for instance, popular gaming streams - we're able to collect quite a bit of helpful information about the behavior and language of folks from a wide range of communities. We're still brainstorming here, and quite interested in feedback from the community, but we believe that aggregating this sort of information into even something as simple as a letter grade for different gaming communities could be a powerful tool - helping players identify inclusive games, vote with their feet, and signal to the industry at large how much solving these problems matters.
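
As a toy illustration of the letter-grade idea, the sketch below converts an aggregate rate of toxic interactions observed in public data into a grade; the cutoffs are invented for the example and are not a proposed grading standard.

```python
def community_grade(toxic_interactions: int, total_interactions: int) -> str:
    """Map an observed toxicity rate to a simple letter grade.

    The cutoffs here are illustrative placeholders, not a real standard.
    """
    rate = toxic_interactions / max(total_interactions, 1)
    if rate < 0.01:
        return "A"
    if rate < 0.03:
        return "B"
    if rate < 0.07:
        return "C"
    if rate < 0.15:
        return "D"
    return "F"

# e.g. 240 toxic interactions out of 10,000 sampled from public streams
print(community_grade(240, 10_000))  # -> "B"
```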

Modulate is thrilled to be a part of such a major industry shift, and we're deeply excited and humbled by the substantial opportunity here to improve the lives of so many. If you have any feedback on how we're thinking about enabling safer and more inclusive online communities, please don't hesitate to reach out to us at ethics@modulate.ai.