
Tech Platforms Fail Kids: No High-Risk Suicide Content Found

Summary

  • No tech platform self-identified as high risk for suicide content.
  • Ofcom found platforms' risk assessments inconsistent, with gaps around child sexual abuse content.
  • Nearly half of girls viewed high-risk suicide content in one week.

A recent Ofcom report has exposed a startling lack of self-awareness among major tech platforms regarding harmful content, with none identifying themselves as high risk for suicide or self-harm material. This finding has drawn sharp criticism from campaigners who brand it "abysmal" and unreliable, particularly given recent research indicating children are extensively exposed to such content online.

Ofcom's review of platforms' risk assessments, mandated by online safety laws, revealed inconsistencies and significant gaps, especially concerning child sexual abuse and exploitation. The watchdog had to compel firms to revise their assessments, citing "substantive" and "outstanding" concerns about their methodologies and conclusions. Few providers specifically assessed risks relating to suicide, self-harm, and hate content.

Campaigners and charities have expressed grave concern, citing research in which 49% of girls viewed high-risk suicide content within a single week and more than 70% of parents said they worry about children encountering such material. They argue that self-assessment by platforms is insufficient, calling for strengthened legislation to hold companies accountable for dangerous products and for the regulator to demonstrate more effective harm reduction.

Disclaimer: This story has been auto-aggregated and auto-summarised by a computer program. This story has not been edited or created by the Feedzop team.
Ofcom is using its powers under the Online Safety Act, including issuing information requests and considering enforcement action or investigations where platforms fall short in protecting children.
