AI's Dark Side: Sexualized Images of Athletes Emerge
10 Feb
Summary
- WSL clubs are monitoring X's AI tool Grok over its generation of sexualized images of athletes.
- X restricted Grok's image editing in January after UK regulator Ofcom opened an investigation.
- Off-platform AI tools are also being used to create similarly concerning manipulated images.

Women's football organizations in England are actively monitoring X's artificial intelligence tool, Grok, amid significant concerns over its capacity to generate sexualized images, including of female athletes. Some top-tier clubs are taking proactive measures, updating guidance for players and staff on safe social media use. X, owned by Elon Musk, reportedly limited Grok's image editing capabilities in January following an investigation by UK regulator Ofcom into the tool's alleged use for creating inappropriate content.
Signify, an organization that helps sports clubs combat online abuse, has been addressing the Grok issue since late 2025 and is advising clubs on mitigation strategies. It stresses that the problem is not confined to X: similar off-platform AI tools are also being exploited to produce alarming images and videos, suggesting this will remain a persistent challenge.
While Ofcom's regulatory powers are limited to harms covered by the Online Safety Act, the regulator is supporting government efforts to explore wider AI regulation. In response to the threats, Signify has introduced a service to detect manipulated images and report them to X, which has reportedly acted decisively on such content. The European Commission also launched an investigation into X in January over the dissemination of illegal content, including manipulated explicit images.