Modulate prepares studios for regulatory compliance
With ToxMod's voice-native proactive moderation and our team's policy expertise, Modulate helps developers and publishers unlock regulatory compliance.
Experts in online safety policy for game studios
Remove illegal content
Improve transparency reports
Encourage positive play
Expedite report analysis
Upcoming regulations you need to know
Digital Services Act
The Digital Services Act (DSA) is a European law designed to require online platforms to combat harmful and illegal behaviors and foster transparency across the industry.
The DSA outlines a variety of obligations, most notably requiring regular transparency reports that disclose the effectiveness of each platform's moderation efforts, and requiring platforms to ban users who repeatedly upload or share illegal content.
Effective: January 1, 2024
Maximum penalty: 6% of annual revenue
ToxMod can identify the most severe behavior across your platform, helping you stay compliant while genuinely improving your community experience.
Children's Online Privacy Protection Act
The Children's Online Privacy Protection Act (COPPA) imposes controls on online platforms that are directed toward children or that serve a significant number of underage users.
Among other requirements, COPPA requires parental consent before platforms can process PII (personally identifiable information) of children under the age of 13. COPPA is designed specifically to protect children, and allows for looser interpretations when it’s in the best interest of the child.
"COPPA 2.0" is currently on the Senate floor and would extend these protections to teens as well.
Effective: April 21, 2000
Maximum penalty: $43,000 per impacted child
Modulate limits our collection of PII wherever possible, and we restrict all collected data to be used solely for moderation to improve user safety. We're certified COPPA-compliant.
UK Online Safety Bill
Several countries currently have Online Safety Bills implemented or under consideration. The UK’s Online Safety Bill requires platforms to proactively remove illegal content, provide expansive explanations in their terms of service regarding moderation practices, and limit the risk of underage users accessing adult or otherwise harmful content.
Effective: TBD 2023, pending Royal Assent
Maximum penalty: 10% of annual revenue
ToxMod proactively monitors conversations across the ecosystem and provides a categorized, prioritized list of the worst harms back to your team, allowing you to engage with the most severe content first.
eSafety Industry Codes
Australia's Online Safety Bill created the eSafety Commission, an office solely devoted to enforcing online safety standards.
The eSafety Commission is empowered to publish "Industry Code" expectations for a variety of industries, including game developers. The Industry Code relevant to games was initially rejected and is now being reworked to require more proactive efforts by platforms.
Effective: January 23, 2022
Maximum penalty: Variable, at the eSafety Commission's broad discretion
ToxMod can ensure studios become aware of illegal content even if players don’t report it, and can also augment player reports with substantially more context to help platforms take action efficiently and consistently.
Code of Practice for Online Safety
The Singapore Code of Practice for Online Safety is a set of guidelines aimed at promoting safe and responsible online behavior. It outlines measures for online service providers to prevent and combat harmful content and activities on their platforms while ensuring user privacy and freedom of expression.
Effective: July 18, 2023
Maximum penalty: S$1 million fine, plus S$100,000 per day
ToxMod's proactive moderation technology helps studios surface CSAM and terror content in near real-time, and connects with player report systems for speed & transparency.