Privacy
  • privacy
  • enforcement
  • children

ICO investigations show child-data enforcement is moving upstream into platform design

The UK ICO’s investigations into TikTok, Reddit, and Imgur signaled a broader regulatory focus on recommender systems, age assurance, and how platforms structurally handle children’s data.

What happened

In March 2025, the UK Information Commissioner’s Office announced three investigations into how TikTok, Reddit, and Imgur use UK children’s personal information. The TikTok investigation was framed around how data from 13–17-year-olds may be used in recommender systems, while the Reddit and Imgur investigations focused on children’s data use and age-assurance measures.

Why it matters

The important shift is one of regulatory focus. This is not only about whether a platform published the right privacy notice; it is about whether recommendation logic, age checks, and product defaults expose children to preventable harms. That moves child-data enforcement further upstream into platform design, ranking systems, and onboarding controls.

Who is affected

  • social and video platforms with large youth audiences
  • product teams that still treat age assurance as a policy edge case
  • regulators building the next wave of child-safety and privacy enforcement strategy

What to watch next

  • whether these investigations lead to formal findings, penalties, or undertakings
  • whether recommender-system scrutiny becomes more central in child-data cases
  • whether platforms respond by changing defaults before enforcement lands

Verification status

This briefing is based on an official ICO investigation announcement.