Instagram Parents Alerted to Teen Self-Harm Searches
26 Feb
Summary
- Parents may be alerted if teens search suicide or self-harm terms.
- Feature rolls out as Meta faces trials over alleged child harms.
- Teens' conversations with AI about self-harm may also trigger alerts in future.

Instagram is launching a new feature that will alert parents if their teens repeatedly search for terms associated with suicide or self-harm. This notification system is exclusively for parents enrolled in Instagram's supervision program. The platform already blocks such content and redirects users to helplines.
The rollout comes as Meta fights two major legal battles. A trial in Los Angeles examines claims that Meta's platforms are designed to be addictive and harmful to minors, while another in New Mexico addresses allegations that Meta failed to protect children from sexual exploitation. Thousands of families and institutions are suing Meta and other social media companies.
Meta executives, including CEO Mark Zuckerberg, have contested the assertion that their platforms cause addiction; Zuckerberg has said scientific evidence has not definitively established a link between social media and mental health problems. The company also plans similar parental notifications about children's interactions with AI, particularly conversations concerning suicide or self-harm.
