Regulators warn tech platforms that age assurance must actually work

March 18, 2026

The ICO and Ofcom have warned major platforms that minimum‑age policies must be properly enforced using robust, effective age‑assurance mechanisms. Both regulators have written to large tech companies setting out clear expectations and signalling that enforcement action will follow where platforms do not comply.

What has Ofcom said?

Ofcom is calling on user‑to‑user and search services to take immediate steps across four key areas:

  • Make minimum‑age rules enforceable in practice: Ofcom’s research has found that 72% of children aged 8–12 are routinely accessing platforms with a stated minimum age of 13. Even though the Online Safety Act 2023 (OSA) does not expressly mandate minimum‑age checks, Ofcom expects platforms to deploy highly effective age‑assurance mechanisms, not merely self‑declaration, to meet their existing statutory duties around child safety.
  • Introduce failsafe anti‑grooming controls: Platforms must ensure that unknown adults cannot contact children. This includes using age assurance to verify both parties’ ages, alongside other protective technical controls.
  • Make algorithmic feeds safer for children: Ofcom identifies recommendation systems as the main pathway to harm for children. It is issuing detailed information notices to large services to scrutinise how recommender systems work, how harmful content is prioritised or suppressed, and the adequacy of existing mitigations. It has stated that enforcement action will follow if platforms’ systems present unmanaged risks.
  • Stop “testing on children” when deploying new AI features: New generative AI tools are being rolled out rapidly, and often used by children, without transparent risk assessment. Under the OSA, platforms must assess the risks associated with significant product updates before deployment. Ofcom expects to be notified where such assessments identify significant child‑safety issues.

Platforms must report back to Ofcom by 30 April 2026 explaining what actions they will take. Ofcom is encouraging firms to publish their plans. In May 2026, it will publish its assessment of industry responses, potential next steps for enforcement, and new research on how children’s online experiences have changed in year one of the OSA regime.

The ICO’s approach

In a parallel move, the ICO has issued an open letter to social media and video‑sharing services operating in the UK, requiring them to strengthen age‑assurance measures and prevent young children from accessing services that are not designed for them. It says that platforms with minimum‑age policies must stop relying on self‑declared ages and adopt available and viable age‑assurance technologies in line with its Children’s Code.

The ICO has also written directly to certain major platforms, asking them to demonstrate how their current systems meet these expectations. It is clearly prepared to escalate – its recent fines include:

  • Reddit – £14.47 million, and
  • MediaLab (Imgur) – £247,590.

Both cases concerned failures to implement age assurance and unlawful processing of children’s data that exposed under‑age users to potentially harmful content.

The ICO remains concerned about profiling and automated recommendations that lead children to harmful content or contribute to addictive user journeys. This forms part of its ongoing Children’s Code enforcement programme and aligns with wider European scrutiny of recommendation systems, including the European Commission’s ongoing investigation into Shein.

Looking ahead

Ofcom and the ICO will publish an updated joint statement in March 2026 clarifying how online safety duties and data‑protection requirements intersect in relation to age assurance. The industry also awaits the outcome of the UK government’s consultation on children’s online wellbeing, which could introduce additional statutory requirements. The UK Parliament recently voted down amendments to the Children’s Wellbeing and Schools Bill that would have required all regulated user‑to‑user services to introduce “highly effective” age assurance for under‑16s. However, MPs did support a government amendment enabling faster intervention on potential social‑media bans or restrictions once the consultation concludes.