Online age checks now enforced

July 30, 2025

Since 25 July, the provisions of the Online Safety Act 2023 requiring measures to be in place to protect children have been in force. Among other things, age checks are now enforced to prevent children from accessing pornography and content relating to self-harm, suicide and eating disorders.

Over the last month, the UK’s biggest and most popular adult service providers – including Pornhub – and many smaller sites have committed to deploying age checks across their services. Ofcom says that this means it will be harder for children in the UK to access online porn than in any other OECD country.

Ofcom has warned that it is ready to enforce against any company which allows pornographic content and does not comply with age-check requirements. Ofcom is extending its existing age assurance enforcement programme – previously focused on studio porn services – to cover all platforms that allow users to share pornographic material, whether they are dedicated adult sites or other services that include pornography.

Ofcom will be checking compliance from 25 July and is planning to launch any necessary investigations into individual services in the near future. These would add to 11 investigations already in progress.

Under Ofcom rules, sites that allow other forms of harmful content must also now have effective age checks in place. Ofcom is launching a new age assurance enforcement programme to monitor the industry’s response. This programme will specifically target sites dedicated to disseminating other harmful content, including material relating to self-harm, suicide, eating disorders and extreme violence or gore.

Ofcom’s codes also dictate that online services should act to protect children from dangerous stunts or challenges and from misogynistic, violent, hateful or abusive material. It has launched an extensive monitoring and impact programme with the aim of holding platforms accountable. This will be focused on the websites and apps where children spend the most time, including Facebook, Instagram, TikTok and YouTube. The programme includes:

  • a comprehensive review of these platforms’ assessments of the risks their services pose to children, which must be submitted to Ofcom by 7 August at the latest. Ofcom will report on its analysis of these assessments later this year.
  • scrutinising these platforms’ practical actions to keep children safe – details of which must be disclosed to Ofcom by 30 September. Ofcom will focus on these areas in particular: whether they have effective means of knowing who their child users are; how their content moderation tools identify content harmful to children; how effectively they have configured algorithms so that the most harmful material is blocked in children’s feeds; and how they have prevented children from being contacted by adult strangers.
  • tracking children’s online experiences to judge whether safety is improving in practice – through Ofcom’s ongoing programme of children’s research and consulting with children through new work with the Children’s Commissioner for England.
  • swift enforcement action if evidence suggests that platforms are failing to comply with their child safety duties.