Minnesota Becomes First State to Ban AI Nudification Apps
1 May
Summary
- Minnesota is set to become the first state to ban AI "nudification" apps.
- Developers face lawsuits for damages and could see their products blocked in the state.
- The state's attorney general can impose fines of up to $500,000 per fake nude.

Minnesota is set to become the first U.S. state to ban apps designed to create non-consensual nude images of individuals. The new law, which is anticipated to be signed by Governor Tim Walz and take effect this August, targets the technology that makes it easy to sexualize images of real people.
Developers of websites, apps, or software that facilitate the "nudification" of images will face significant legal repercussions. Victims can sue for extensive damages, and offending products may be blocked within Minnesota. The state's attorney general will also have the authority to impose fines of up to $500,000 for each fake AI nude identified.
Introduced by Senator Erin Maye Quade, the bill gained bipartisan support following reports of a Minnesota man creating fake nudes of over 80 acquaintances. This legislation aims to provide legal recourse for victims and address the growing concern over AI-generated non-consensual intimate imagery, particularly affecting women and children.
The law includes an exemption for products requiring significant user technical skill for image manipulation, focusing instead on the ease of harm enabled by mainstream nudification apps. Advocates worked with tech companies during drafting to avoid unintended impacts on legitimate software like Photoshop.
Concerns remain about enforcing the ban against apps operated overseas, such as DeepSwap, which has claimed bases in Hong Kong and Dublin. There is also apprehension that future federal deregulation efforts could undermine state-level safeguards. Prominent AI tools, including Elon Musk's Grok, are under scrutiny for generating non-consensual images as well: law enforcement in Nashville recently made an arrest linked to Grok-generated child abuse material.