
NSW Cracks Down on Deepfake Abuse: Jail Time for Explicit AI-Generated Images

Summary

  • Legislation introduced to expand offences for producing or distributing intimate images without consent
  • Sharing of explicit deepfake images of underage Australians has doubled since 2023
  • Almost all deepfake AI images in circulation are pornographic, with 98% depicting women

In a move to combat the growing problem of deepfake abuse, the New South Wales government has introduced new legislation to expand existing offences covering the production and distribution of intimate images without consent. As of August 7th, 2025, young men and teens who use artificial intelligence to create sexually explicit deepfakes of women and girls could face serious legal consequences.

NSW Attorney-General Michael Daley warned that "playing with these images on your phone is a serious offence that could land you in jail." The new laws aim to ensure that "anyone who seeks to humiliate, intimidate or degrade someone using AI can be prosecuted." Data from the eSafety Commissioner shows that the sharing of explicit deepfake images of underage Australians has doubled since 2023, with almost all circulating deepfake AI images being pornographic and 98% targeting women.

Experts have emphasised the severe harm caused by these deepfake images, with Full Stop CEO Karen Bevan stating that "sexual violence of any kind is not acceptable and that this is real harm." NSW Women's Safety Commissioner Hannah Tonkin echoed this sentiment, calling deepfake technology "terrifying" and noting that "women and girls are the main targets."

Disclaimer: This story has been auto-aggregated and auto-summarised by a computer program. This story has not been edited or created by the Feedzop team.


FAQ

What has the NSW government done to combat deepfake abuse?
The NSW government has introduced new legislation to expand existing offences for the production and distribution of intimate images without consent, with offenders facing up to 3 years in jail or $11,000 in fines.

How common is the sharing of explicit deepfake images of minors?
According to data from the eSafety Commissioner, the sharing of explicit deepfake images of underage Australians has doubled since 2023.

Who are the main targets of deepfake abuse?
According to NSW Women's Safety Commissioner Hannah Tonkin, women and girls are the main targets of deepfake images, which she calls "terrifying technology" that can be "weaponised to cause immense harm."
