Gaming Giants Face Child Safety Probe
22 Apr
Summary
- Online gaming platforms asked to detail child protection measures.
- Regulator cites risks of grooming, radicalization, and extortion.
- Platforms face penalties for non-compliance with new notices.

Australia's eSafety regulator has ordered popular online gaming platforms, including Roblox, Minecraft, Fortnite, and Steam, to explain how they protect children. The regulator has issued legally enforceable transparency notices requiring the platforms to detail the systems, staffing, and safety measures they have in place to keep young users safe.
Commissioner Julie Inman Grant said that gaming-adjacent services, including encrypted messaging, are often the first point of contact for offenders engaged in grooming, sexual extortion, and the radicalization of children. She noted that nine in ten Australian young people aged eight to seventeen play online games, making these environments attractive to predatory adults seeking to exploit children or embed extremist narratives.
The regulatory action comes amid heightened scrutiny of gaming platforms' capacity to detect online threats to minors. Policing methods developed for traditional social media are less effective against the real-time chat with unknown users that many gaming platforms offer. Roblox recently paid more than $23 million to settle claims brought by Alabama and West Virginia over its alleged failure to protect young users, and it faces more than 140 lawsuits in U.S. federal courts relating to child sexual exploitation.