The People’s Metaverse

The hottest topic in tech today is the burgeoning “metaverse.” The exact meaning of the word is widely debated, and each provider brings its own interpretation. As a partner to many leading platforms, including the social VR platform Rec Room, Modulate has had a front-row view of these discussions. And while the specifics certainly vary, one high-level trend emerges again and again.

The metaverse means being able to build genuine and meaningful relationships just as easily online as in the physical world.

Players want social connection and community engagement. They want friendly rivalries and opportunities to grow as people. They want support and inclusivity and acceptance. Games today can offer these experiences in fits and starts, but by and large they fall short of what players are truly looking for. Despite the efforts of many platforms to prevent it, toxicity - hate, harassment, trolling, grooming, and other malevolent or disruptive behaviors - seeps into far too many of these interactions and prevents these “proto-metaverses” from reaching their full potential.

This need shows up in statistics as well:

  • The ADL’s research found that the #1 thing players wanted was for platforms to take more responsibility for protecting and supporting their players.
  • Modulate’s own surveys found that 50% of players marked voice chat - the channel where the most emotive and vulnerable conversations typically occur - as one of the most urgent moderation priorities for any online platform; and that 75% of players would not just accept, but celebrate, new efforts to solve toxicity there.

And this is just what users say explicitly! Looking beyond surveys, we see similarly potent takeaways.

  • The ADL’s 2019 report noted that only 8% of users typically report harassment they experience.
  • We could cite study after study with similar findings, but we suspect you get the point.

Sure, you may say, toxicity is a real problem, and players are demanding solutions. But those solutions usually involve some kind of moderation - which can mean taking some autonomy away from those same end users, in a way they might push back against. So do users actually respond well in practice when these tools are deployed?

At Modulate, we have quite a bit of experience with this question, having launched our voice moderation solution ToxMod in a number of popular titles to date. In our recent launch with Rec Room, the Rec Room team did a great job of notifying their users about the change - both flagging it in advance and providing deeper context as the initial deployment went live. Even so, we wondered: how would Rec Room’s community react when they actually saw the notice in-game informing them that voice moderation was active?

The response was overwhelmingly positive. Of course, as one might expect, there were questions at first:

  • I enjoy trash talking with my friends - am I going to get banned? (No - ToxMod helps studios enforce their Code of Conduct more efficiently and comprehensively, and harmless joking among friends is fine with Rec Room - and most studios!)
  • Will my data be resold or used outside of the moderation process? (No - ToxMod only saves and escalates audio when it detects toxicity. Normal conversations aren’t retained, and even flagged audio is used solely for safety - never resold or repurposed. The sketch below illustrates this policy.)
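
For readers who want to see the shape of that policy, here is a minimal sketch of a “retain only on detection” gate. To be clear, this is not ToxMod’s actual code: the schema, function names, and threshold are hypothetical stand-ins for the behavior described above.

```python
from dataclasses import dataclass

@dataclass
class ModerationResult:
    """Outcome of analyzing one short voice-chat clip (hypothetical schema)."""
    toxicity_score: float  # 0.0 (benign) through 1.0 (severe)
    category: str          # e.g. "harassment", "hate", or "none"

# Stand-in for a moderator review queue; a real system would use a
# secure, access-controlled store, not an in-memory list.
review_queue = []

def handle_clip(audio: bytes, result: ModerationResult,
                escalation_threshold: float = 0.8) -> None:
    """Retain and escalate audio only when toxicity is detected.

    Clips scoring below the threshold are discarded immediately, so
    ordinary conversation is never stored; flagged clips are queued
    for human moderators and used solely for safety review.
    """
    if result.toxicity_score >= escalation_threshold:
        review_queue.append((audio, result))
    # Otherwise the clip simply goes out of scope and is discarded.

# A benign clip is dropped; a flagged one is escalated.
handle_clip(b"...", ModerationResult(toxicity_score=0.10, category="none"))
handle_clip(b"...", ModerationResult(toxicity_score=0.93, category="harassment"))
assert len(review_queue) == 1
```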

But the vast majority of comments underscored exactly this need for real solutions to toxicity (comments are lightly edited for clarity and to remove any identifying info):

  • I got a VR setup recently so I could play with my kid. I was shocked by what I heard when I first logged on, and was never brave enough after that to re-enter the Rec Centers [public areas]. I’m so excited that Rec Room is working to make these spaces safer!
  • I love this! A while ago, I bought a girl outfit (I’m a boy), and someone immediately got in my face and was calling me a “f**.” I haven’t played since, but now I’m excited to go back and feel safe vibing as myself.
  • As a bi and trans player, I’m so fed up with being called homophobic and transphobic words just when I’m trying to have a good time. I’m a big fan of this idea.

Not only did players endorse the idea, but many became broader advocates themselves. They dug into our website and began answering others’ questions, clarifying ToxMod’s approach to assuage fears. Some influencers, like Tarapeutic, even created rich videos explaining how ToxMod worked and encouraging users to share their reactions.

By taking steps to fight toxicity, Rec Room hadn’t just made their platform safer for players. They’d also clearly demonstrated to their users that they were - and remain - committed to building the kind of metaverse that players want. And the community hasn’t been shy about their appreciation for that effort.

. . .

In the metaverse, user-generated content means that experiences alone won’t differentiate platforms - players have already recreated Ocarina of Time in Minecraft and fully functional Pokemon games in Roblox.

Virtual asset technologies - whether built on Web3 infrastructure or more traditional tools - together with virtual avatar tools mean that your identity and resources will soon be carried from platform to platform, freeing players to explore new spaces without fear of losing their digital assets.
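
If it helps to picture that portability, here is a purely illustrative sketch - no specific platform, standard, or API is implied, and every field name is a hypothetical stand-in - of a player-held identity manifest that any participating world could verify and import:

```python
from dataclasses import dataclass, field

@dataclass
class PortableIdentity:
    """Hypothetical manifest for a cross-platform identity.

    The player holds the manifest; each platform verifies and renders
    it locally, so hopping between worlds never risks losing assets.
    """
    player_id: str                     # stable identifier the player controls
    avatar_uri: str                    # pointer to a portable avatar model
    assets: list = field(default_factory=list)  # identifiers of owned items
    signature: str = ""                # proof of ownership, e.g. a key signature

def import_identity(manifest: PortableIdentity, platform: str) -> str:
    """Simulate one platform accepting an externally held identity."""
    # A real platform would verify `signature` before trusting the manifest.
    return f"{platform}: welcomed {manifest.player_id} with {len(manifest.assets)} assets"

me = PortableIdentity("player#1234", "https://example.com/avatar.glb",
                      assets=["sword_skin_7", "founders_cape"])
print(import_identity(me, "WorldA"))
print(import_identity(me, "WorldB"))  # same identity, different world
```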

What will differentiate these metaverse options, then, won’t be what you can do. It won’t be what you have or what you look like. It will be who you can socialize with, and what those interactions look like.

So let’s learn from Rec Room’s example. Let’s remember who the metaverse is for, and make sure that rich, authentic, and inclusive communities are the centerpiece of what we’re building. Let’s do the work now, rather than waiting until things get worse and bad behavioral patterns have already calcified. And most importantly, let’s do it together, and work with our users to create spaces which truly fulfill the wondrous promise of the “metaverse” for everyone.