Charity Banned by AI Over 'Heroin' Mix-Up
18 Nov
Summary
- A photography charity's Facebook group was banned by AI.
- AI mistook the charity's name for promoting illegal drugs.
- The charity relies on Facebook for audience engagement.

The photography charity Hundred Heroines suffered significant disruption when Facebook's AI-driven content moderation repeatedly removed its group. The platform's automated systems flagged the group for violating drug-related community standards, apparently misreading the charity's name, 'Hundred Heroines', as a reference to heroin. The ban has occurred twice in 2025, causing distress and operational problems for the organization.
Founder Dr. Del Barrett said the AI's inability to distinguish context led to the erroneous bans, undermining the charity's ability to connect with its audience. Established in 2020, the charity houses a significant collection of works by female photographers, and its reliance on Facebook for outreach meant the bans had a 'devastating' effect on visitor numbers.
Meta, Facebook's parent company, says it is committed to safety and maintains robust measures against illegal content, particularly in light of the opioid crisis. For Hundred Heroines, however, the episode illustrates the 'Kafkaesque' side of AI moderation: when errors occur, users struggle to reach a human reviewer and protect their online presence. Barrett expressed frustration at being pushed to alter the charity's brand because of a bot's misjudgment.