Meta Knew Instagram DMs Were Risky Years Ago
25 Feb
Summary
- Meta was aware of risks, such as explicit images in DMs, nearly six years before adding protections.
- Nearly 20% of teen users saw unwanted nudity on Instagram.
- Lawsuits allege tech firms prioritized engagement over teen safety.

In a federal lawsuit examining the addictive nature of social media, Meta executives faced questions about the delayed rollout of safety features on Instagram. Internal communications from August 2018 showed the company was aware that direct messages posed significant risks to teens, including "horrible things" like explicit images. Despite this knowledge, a nudity filter for direct messages was not introduced until April 2024.
Testimony highlighted troubling statistics: 19.2% of surveyed 13- to 15-year-olds had encountered unwanted nudity or sexual images on the platform, and 8.4% reported seeing content related to self-harm. Plaintiffs' attorneys are focusing on the nearly six-year gap between Meta's acknowledged awareness of these dangers and the rollout of protective measures.
The ongoing legal battles, including cases in California and New Mexico, aim to hold major tech companies accountable. Plaintiffs argue that the platforms are designed to maximize screen time, fostering addiction in minors, and that the companies prioritized user engagement and growth over the safety of their youngest users.