Aussies Face Digital Age Check for Adult Content
5 Mar
Summary
- Online Safety Act mandates age verification for adult content access.
- Platforms such as social media sites and search engines must verify users' ages.
- Non-compliance risks fines up to $49.5 million for companies.

New regulations under Australia's Online Safety Act take effect this Monday, requiring digital platforms to verify users' ages. This measure targets access to adult content, such as pornography, self-harm material, and extreme violence. Social media sites, app stores, search engines like Google and Bing, and even AI platforms must implement these age checks.
The legislation also extends to internet infrastructure providers, including phone and broadband services. Companies have flexibility in their verification methods, which can range from parental confirmation and photo ID to facial age estimation or credit card checks. Failure to comply could result in significant penalties, with fines of up to $49.5 million.
Experts note that while these measures are a logical step for child protection, they may be easily circumvented using VPNs, a tactic observed when similar laws were introduced in the UK. Despite the potential workarounds, the eSafety Commissioner's office has outlined various acceptable age assurance methods for platforms to adopt.