UK Regulators Demand Safer Kids' Social Media
12 Mar
Summary
- Social media firms must improve age checks by April 30.
- Regulators cite concerns over algorithmic feeds and child safety.
- Companies face significant fines for non-compliance with new rules.

Britain's media and privacy watchdogs are pressing major social media companies to strengthen protections for child users. Ofcom and the Information Commissioner's Office (ICO) have raised serious concerns over algorithmic feeds that expose children to potentially harmful or addictive content, and say companies are failing to enforce their own minimum-age policies.
By April 30, platforms including Facebook, Instagram, Roblox, Snapchat, TikTok, and YouTube must present plans to improve age verification, limit contact from strangers, create safer feeds, and cease testing new products on minors. The ICO separately urged these companies to adopt "modern, viable" tools to prevent under-13s from accessing services not intended for them.
Ofcom has the authority to levy fines of up to 10% of a company's qualifying global revenue, while the ICO can impose penalties of up to 4% of global annual turnover. The warning follows a recent fine of nearly 14.5 million pounds imposed on Reddit for inadequate age checks and unlawful processing of children's data.
