Roblox Faces Scrutiny Over Child Safety Concerns
10 Feb
Summary
- Minister Anika Wells seeks urgent meeting with Roblox over child safety.
- Predators reportedly target children with explicit and suicidal material.
- New social media laws restrict platform access for minors under 16.

Communications Minister Anika Wells has urgently requested a meeting with the gaming platform Roblox following alarming reports of children being exposed to sexually explicit and suicidal material. She expressed deep concern over reports of predators grooming young users on the platform and exploiting their innocence. These issues persist despite Roblox's two-year engagement with eSafety and its recent compliance measures under Australia's social media minimum age restrictions, which took effect on December 10, 2025.
Roblox, a vast ecosystem of user-created 'experiences', has stated that 60 percent of its Australian daily active users have completed age checks. The platform has also disabled direct chat and voice functions for Australian children as part of its compliance with the new law. However, eSafety Commissioner Julie Inman Grant remains highly concerned by ongoing reports and stated that Roblox must immediately enhance measures to block predator access and prevent exposure to harmful content.
The government is exploring further actions, including potential enhancements to existing powers and the development of digital duty of care legislation. This proposed law would compel large online platforms to proactively prevent foreseeable harms to users. Codes of practice focusing on age-restricted material, including self-harm content, are set to take effect on March 9, 2026, and will apply to Roblox. Non-compliance with the social media ban could result in fines of up to $49.5 million.




