By Mike Pappas (CEO, Modulate) and Tess Lynch (Privacy Associate, Premack Rogers)
For over two decades, online platforms, including games, have enjoyed a regulatory landscape light on both privacy and safety restrictions. The passage of Section 230 in 1996, during the internet’s formative years, offered platforms broad immunity from liability for user-generated content in an effort to enable rapid growth and innovation. A few years later, COPPA was introduced to protect the privacy of children - an admirable and important first step, though one that left plenty of additional questions about the safety and privacy of older users. In 2016, the EU introduced GDPR (effectively replicated by California as the CCPA), requiring additional privacy controls for users; yet platforms remained protected by Section 230 from liability for the increasingly problematic behavior of their users.
Faced with this discrepancy, most platforms chose to skew towards protecting user privacy by monitoring as little user data as possible. But this only exacerbated safety risks, especially in online games, where emotions often run high and users frequently chat live with relative strangers. Solving this problem would not be easy - how does one strike a balance between protecting user privacy and ensuring online safety, while keeping the internet free and open for creative expression?
The Coming Regulatory Wave
In recent years, policymakers have begun to take on this challenge in earnest. Increased scrutiny of online spaces - driven by world events including COVID-related misinformation, potential election influence, and the rise of white supremacist extremism online - has led to a new wave of regulation and enforcement intended to hold tech platforms accountable. This wave is global, widely supported by the general population, and even enjoys bipartisan support in the US. Regulators have also begun paying special attention to online games in particular, given their large numbers of underage users and recent reports of how common extremist behavior has become in gaming spaces. Game studios need to pay close attention to these changes to ensure they align with global data protection and content moderation regulations.
Unfortunately, staying on top of the rapidly shifting legal landscape can be time-consuming and overwhelming. In the US, policymakers have begun debating Section 230 anew, with a focus on the ways this immunity leads to a proliferation of hate speech, harassment, and misinformation on top platforms. Australia, Ireland, the UK, and Singapore have all introduced online safety acts, imposing some of the first explicit requirements on game developers to moderate harmful behavior. The EU got in on the action with the Digital Markets Act (DMA) and Digital Services Act (DSA), which similarly aim to protect users from harmful content and foster fair competition without stifling innovation. Privacy regulation is being revisited and updated in light of tech’s evolution as well - in the US alone, KIDS, KOSA, COPPA 2.0, and PRIVCY have all recently been under consideration, while California introduced its Age-Appropriate Design Code Act, modeled on the UK’s Children’s Code.
All these regulations open up a broad set of questions studios need to answer, including:
- Who is using my platform? How do I verify ages and identities?
- What responsibility do I have to remove illegal or harmful content?
- How can I even identify that illegal or harmful content in the first place, and do so at scale?
- How can I ensure my moderation decisions are accurate, consistent, and unbiased?
These questions are not merely abstract, either. Studios are already facing steep fines, including more than half a billion dollars paid by Epic Games in a settlement with the FTC. And as the DSA and other regulations come into force in 2024, experts only expect enforcement to become more common.
So the problem is multifaceted, difficult, urgent, and costly to ignore - but few game studios have the budget or resources to build a full team capable of tracking all these considerations or manually reviewing each user interaction.
Since Modulate has already been wading through these waters, we want to help studios make sense of this evolving landscape. We’ve worked closely with regulators, thought leaders, and industry experts to understand each of these regulations, the surrounding case law and environment, and especially the appetite and focus for enforcement. While we are not lawyers and cannot offer legal advice, we do hold ourselves to the highest standards (as required by our international customers) for our content moderation solutions, and we’ve collected our takeaways into a series of blog posts to help studios seeking to become or remain compliant.
In this post, let’s discuss how upcoming regulations will reshape the Trust & Safety landscape.
Duty of Care and the Shift from a Knowledge Standard to a Harm Standard
One of the greatest frustrations of regulators and citizens alike is that, despite widespread public knowledge of the problems occurring in online games, studios disclaim liability on the grounds that they are unaware of, and unable to monitor, the harmful content they host.
Now, in many cases, the studios are raising a real and complex technical concern (though tools do exist which can help). But this doesn’t change the fact that regulators want to see studios take more responsibility for what happens online; thus the introduction of concepts around a “duty of care.”
A duty of care holds that platforms have an obligation to care for their users and are responsible if harm is inflicted on them, regardless of whether the platform was aware of it. In legal parlance, this represents a transition from a “knowledge standard” for liability to a “harm standard.”
While this new approach decreases the chances of a studio being punished for not taking action against minor offenses, it also means that studios must now proactively minimize harm on their platform, as they could be liable for such harms even if no user ever reported them. The focus is on harm reduction: multiple bills include provisions allowing the letter of the regulation to be set aside where doing so serves the best interests of the user, particularly children.
What exactly are studios responsible for? California’s Age-Appropriate Design Code Act prioritizes the risk to children of non-private data, especially where that data could be used or accessed by advertisers, bullies, or criminals. The various online safety acts add a duty with respect to illegal content - including pro-terrorist content, child sexual abuse material, and grooming efforts by pedophiles and extremist groups - as well as more ‘mundane’ but still harmful behaviors like bullying, harassment, and encouragement of self-harm.
A New Enforcement Regime
Regulatory enforcement of online safety and privacy is becoming more robust than ever before. The recent settlement between Epic Games and the FTC, for over half a billion dollars, was the largest such settlement in the industry and demonstrates the FTC’s new proactive, activist approach. This could be seen clearly in the Epic Games filings, where the FTC pulled out all the stops to ensure Epic would be punished for the full suite of alleged violations of the spirit of the law, not just of its precise letter.
What gives the FTC this power? Section 5 of the FTC Act prohibits “unfair or deceptive acts or practices in or affecting commerce.” The Act grants the FTC broad authority to impose penalties or restrictions on studios found to be acting unfairly. In turn, “unfair or deceptive acts or practices” can itself be interpreted rather broadly - including situations where the FTC deems consumers to be misled or treated unreasonably by default settings. This gives the FTC a powerful hook for applying sweeping penalties, and in the Epic Games filing it demonstrated a willingness to use that tool.
Historically, filing suit against online studios for safety or privacy violations has been the province of Attorneys General, who often had competing priorities. The introduction of new regulations is changing this. The Australian Online Safety Act and California’s Age-Appropriate Design Code Act both establish dedicated enforcement bodies with a singular focus on codes of practice for online platforms, including games. This newfound emphasis on online safety and privacy enforcement is a clear indication that platforms must take compliance seriously or face significant penalties.
Several of the largest regulations under consideration - including the EU’s DSA, which has been passed and becomes fully applicable in early 2024 - require studios to produce regular transparency reports. These transparency reports go beyond the compliance documents required by existing data protection laws like GDPR; while they require some similar disclosures around data usage and consent, many now demand additional details about each platform’s content moderation practice. These new details can include topics like:
- Relevant policies and procedures;
- Rules for identifying and removing illegal content;
- The tools studios use for moderation;
- How much harmful content is on their platform;
- And the accuracy and action rates for automated moderation, user reports, and player appeals.
Many of these are topics which online studios have been reluctant to share in the past, fearing that providing statistics could encourage regulatory attention or single out their platform as “worse” than competitors. As a result, studios will need to carefully redesign their processes to collect and track the necessary information - especially relating to prevalence and exposure to harmful content, which some studios don’t track at all yet!
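To make the tracking requirement concrete, here is a minimal sketch of the kind of per-period counters a studio might start collecting for a transparency report. All names and fields here are hypothetical illustrations of the metrics listed above (prevalence, action rates, appeal outcomes), not a format mandated by the DSA or any other regulation:

```python
from dataclasses import dataclass

@dataclass
class ModerationStats:
    """Hypothetical per-reporting-period counters for transparency reporting.

    Field names are illustrative only; actual reporting categories should be
    drawn from the applicable regulation and legal counsel's guidance.
    """
    sessions_total: int       # user sessions observed in the period
    sessions_with_harm: int   # sessions where harmful content was confirmed
    items_flagged: int        # items surfaced by automation or user reports
    items_actioned: int       # flagged items that led to a moderation action
    appeals_filed: int        # player appeals against those actions
    appeals_overturned: int   # appeals that reversed the original decision

    def prevalence(self) -> float:
        """Share of sessions in which users were exposed to confirmed harm."""
        return self.sessions_with_harm / self.sessions_total if self.sessions_total else 0.0

    def action_rate(self) -> float:
        """Share of flagged items that resulted in enforcement."""
        return self.items_actioned / self.items_flagged if self.items_flagged else 0.0

    def appeal_overturn_rate(self) -> float:
        """A rough proxy for moderation accuracy: how often actions are reversed."""
        return self.appeals_overturned / self.appeals_filed if self.appeals_filed else 0.0
```

Even a simple structure like this forces the process questions regulators care about: what counts as “confirmed” harm, whether automated flags and user reports are counted separately, and how appeals feed back into accuracy measurement.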
It’s important to keep in mind that content moderation, data protection, and transparency requirements can all benefit your platform if you’re making a genuine effort to minimize harmful content and foster a resilient community. All platforms will be subject to these obligations, and regulators are making efforts to provide guidance around what good looks like. Regulators and consumers alike know that harmful content is widespread and that moderating it is extremely difficult; they are looking for evidence that the studios are taking the problem seriously, not expecting studios to magically solve everything overnight.
As the world of online gaming evolves, so too do the regulations that govern it. These upcoming regulations will also require most studios to make several other comprehensive changes, including:
- Updating terms of service and codes of conduct
- Building out new moderation and monitoring tools
- Implementing UI changes to increase privacy by design
- Establishing internal teams to produce transparency reports and work with regulators as needed
…and much more. This is a mammoth task, and one best not taken on alone.
It’s important for online gaming platforms to consult with legal and regulatory experts to ensure that they are taking the appropriate steps to comply with global regulations.
When it comes to content moderation, balancing safety with privacy, and other practical concerns, Modulate can help. Sign up for our newsletter to stay up-to-date on all of the latest regulations and what impacts we expect them to have for the games industry, or read more about our proactive, voice-native moderation solution, ToxMod, and how it can support your team on your compliance journey.
We also encourage well-intentioned studios to share their approaches and work together with others in the industry to establish strong and achievable standards - which not only offsets the risk of “complying with regulation” becoming an untenable expenditure, but also helps demonstrate to regulators and consumers that the industry is taking this problem seriously and can be trusted to tackle it effectively.
Finally, for those looking to get directly involved, we recommend joining the Fair Play Alliance to share best practices and hear from others in the industry about how to make online games which are safe, private, and fun by design.