New Law: 48 Hours to Remove Non-Consensual Images
19 Feb
Summary
- Firms must remove non-consensual intimate images within 48 hours of reporting.
- Failure to comply risks substantial fines or UK service bans.
- Government targets AI-generated fake nudes and online abuse.

New legislation mandates that social media firms remove non-consensual intimate images within 48 hours of receiving a report. Companies that miss this deadline risk substantial fines, potentially up to 10% of their global revenue, or having their services blocked within the UK.
This initiative is part of a broader government effort to combat online abuse and protect women and girls. It follows government pressure on Elon Musk's AI assistant Grok to disable a feature used for creating fake nude images. The Prime Minister emphasized this as a crucial step in the ongoing fight against online violence.
The proposed amendment to the Crime and Policing Bill aims to strengthen safety measures, particularly in response to the rise of AI-generated explicit content. Victims will reportedly need to report an image only once to have it removed across platforms and automatically deleted if re-uploaded. Communications regulator Ofcom is considering classifying such content similarly to child sexual abuse material, which would facilitate quicker removal.
Ministers are also exploring ways to block access to websites hosting non-consensual intimate images that may fall outside the scope of existing legislation. The government stated that the era of tech firms having a 'free pass' is over, ensuring that online spaces are safer for women and girls.