Ofcom sets out online safety priorities and roadmap

May 13, 2026

Ofcom has set out its online safety priorities, highlighting that it published its Plan of Work for 2026-27 in March.

It is enabling compliance with the Online Safety Act 2023 (OSA) while implementing and operationalising the remaining parts of the OSA, and preparing for new legislative and regulatory online safety initiatives being introduced by the UK government. It also points out that the OSA covers 130 priority offences and that more than 100,000 services come within scope. This means that it needs to choose where to concentrate its efforts.

Implementing the OSA

This year, Ofcom must publish statutory reports on the effectiveness of age assurance, on trends in content harmful to children, and on app stores. It provided advice on Technology Notices this month. In the summer it will publish the register of categorised services, alongside a consultation on the duties applying to those services, including a draft code of practice on fraudulent advertising. In the autumn it will publish updated codes with additional safety measures, including on AI and other automated content moderation tools. It has also decided to bring forward its crisis response measure to the summer. Ofcom has also published a Roadmap update with more information.

New legislation

Recent or upcoming legislation will require substantial policy work from Ofcom to bring it into effect. Ofcom is working on new priority offences under the OSA; for example, it launched a consultation in March to expand the Codes to cover encouraging serious self-harm and cyberflashing.

The Crime and Policing Act received Royal Assent at the end of April. It includes duties to take down non-consensual intimate imagery within 48 hours and to report it to a central registry, automatic Data Preservation Notices if a child dies, and a power for the government to bring more GenAI services into the OSA's scope. In addition, the government has been consulting on children's online wellbeing; if it adopts its proposals, they will create new duties and require comprehensive policy work to implement.

Priorities for action

The first parts of the OSA came into force in March 2025, and in the first year of regulation Ofcom's overriding priority was protecting children, with much of its work focusing on age checks.

Ofcom says that it needs to balance tackling the most widespread harms, such as fraud, against addressing the less frequent but even more severe harms, such as child sexual abuse and grooming. It also needs to balance dealing with household names against smaller services which are accessed less frequently but pose the highest risk of serious or fatal harm. It uses the following criteria: how many people use a service, how serious the harm can be, whether users are especially vulnerable, whether there are widespread problems that keep happening, and how services act when risks are found.

For 2026-27, Ofcom will continue to emphasise protecting children, while also focusing effort on countering terrorism and illegal hate, improving women's and girls' safety online, and dealing with wider issues under the Online Safety Act.

Improving protections for children

Ofcom’s enforcement programmes on child sex abuse material and age assurance will continue. It has told major services used by children to meet clear expectations on effective age checks, protections against grooming and child sexual abuse material, safer feeds and recommendations, and proper testing and risk assessment before new products are launched. Services responded at the end of April, and Ofcom will report later this month.

Removing illegal content, especially hate and terror

Ofcom says there are increasing levels of hate speech online, including a recent surge in antisemitic content. As a result, Ofcom will act on illegal hate and terrorist content. It will update on this compliance programme shortly.

Protecting women and girls

Intimate image abuse disproportionately affects women and girls. Ofcom will build on this year's enforcement action into sexual deepfakes, a nudification site and image-based sexual abuse. This month it will set a new technical standard, hash-matching, to prevent the upload of known non-consensual intimate images. This requires services to move beyond reactive takedown and ensure proactive protections work effectively at scale, including as new forms of abuse emerge. Ofcom will continue to prioritise taking enforcement action against services that fail to do so. In addition, it will collect evidence for a report in 2027 on how tech firms have applied its guidance on protecting women and girls. The report will give details about the progress made towards reducing online gender-based harms with a view to directing future action.
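The idea behind hash-matching is simple: compare a fingerprint of each uploaded file against a database of fingerprints of known prohibited images, and block matches before publication. The sketch below is a minimal illustration using cryptographic hashes and an invented in-memory database; real deployments typically use perceptual hashes (such as PhotoDNA-style fingerprints), which also catch re-encoded or slightly altered copies, and source their hash lists from specialist reporting bodies.

```python
import hashlib

# Hypothetical database of hashes of known non-consensual intimate images.
# In practice this would be a large, externally maintained list; here we
# seed it with the hash of one sample byte string for demonstration.
KNOWN_SAMPLE = b"known prohibited image bytes"
KNOWN_IMAGE_HASHES = {hashlib.sha256(KNOWN_SAMPLE).hexdigest()}

def should_block_upload(image_bytes: bytes) -> bool:
    """Return True if the upload exactly matches a known prohibited image.

    A cryptographic hash only matches byte-identical files; perceptual
    hashing would be needed to match visually similar variants.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES

# Proactive check at upload time, before the file is published.
print(should_block_upload(KNOWN_SAMPLE))            # True: exact match
print(should_block_upload(b"some other image"))     # False: not in database
```

The check runs before content is published, which is what distinguishes this proactive approach from the reactive takedown model the text describes.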

How Ofcom facilitates change

In brief, Ofcom uses three means to create change. The first is awareness raising. Tens of thousands of sites come within the scope of the OSA but most do not pose high risks. However, they still need to carry out risk assessments, so Ofcom provides tools to help. The second is direct supervision of 40 of the highest profile services in sectors from social media to gaming, search, dating and AI. Supervision involves developing a detailed understanding of companies' business models and operating systems, then checking their understanding of what they need to do, and Ofcom setting out bespoke requirements for change company by company. The third and most visible element is enforcement. Ofcom can fine companies up to 10% of their qualifying worldwide revenue (or £18 million, whichever is higher) for breaches of the OSA.
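The fine cap described above is the greater of two figures, so smaller services still face a meaningful maximum penalty. A minimal sketch of the calculation, using illustrative revenue figures:

```python
def max_osa_fine(qualifying_worldwide_revenue: float) -> float:
    """Maximum OSA fine: the greater of 10% of qualifying worldwide
    revenue or £18 million (illustrative calculation only)."""
    return max(0.10 * qualifying_worldwide_revenue, 18_000_000.0)

# A service with £500m qualifying revenue: 10% (£50m) exceeds the £18m floor.
print(max_osa_fine(500_000_000))  # 50000000.0
# A service with £20m revenue: 10% is only £2m, so the £18m floor applies.
print(max_osa_fine(20_000_000))   # 18000000.0
```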