Ofcom updates on implementation timetable for Online Safety Act 2023

November 17, 2025

Ofcom has issued an update on its implementation timetable for the remaining phases of the Online Safety Act 2023. The timetable is as follows:

Now until the end of 2025

Guidance on a safer life online for women and girls

After consulting in February 2025, Ofcom will publish its final guidance on improving women and girls’ online safety on 25 November 2025. Around 18 months later, it plans to publish an assessment of how providers are keeping women and girls safer on their services.

Supporting information gathering on deceased child users

In December, Ofcom will consult on its draft guidance for categorised services about the disclosure of information about a deceased child’s use of their platform. It expects these duties, which apply only to categorised services, to come into force in late 2026 when it finalises the guidance. Ofcom will also publish an update in December to its information powers guidance, covering its final guidance on data preservation notices to support coroners’ investigations into a child’s online activity and updated guidance on coroners’ information notices.

Industry fees

Providers whose “qualifying worldwide revenue” is above a threshold to be set by the Secretary of State for Science, Innovation and Technology have a duty to pay fees. Ofcom expects the fees regime to come into force shortly, subject to completion of the Parliamentary process. Once in force, there will be a four-month notification window for relevant platforms to submit their revenue data to Ofcom for the 2026/27 charging year. 

From January 2026 onwards

Super-complaints

Following its consultation in September 2025, Ofcom will publish its final guidance in February 2026. This will provide information to eligible entities on how to bring to Ofcom’s attention issues that arise across services or, in exceptional circumstances, on a single service (in line with the Act and government regulations). Ofcom is ready to handle super-complaints when the regime comes into force.

Technology notices

Under section 121 of the Act, Ofcom can in limited circumstances require providers of user-to-user or search services to use accredited technology to identify and prevent users encountering child sexual exploitation and abuse (CSEA) and/or terrorism content on their service. It has consulted on policy proposals for minimum standards of accuracy for accredited technology that deals with CSEA and/or terrorism content. It has also consulted on draft guidance for providers on how it proposes to exercise its technology notice function. It will publish its advice to the Secretary of State and its final guidance to providers by April 2026. After the Secretary of State has published the minimum standards of accuracy, Ofcom (or another person appointed by Ofcom) will facilitate a process to accredit technologies as having met them.

Media literacy statement of recommendations

In September 2025, Ofcom published its consultation on draft recommendations for how online platforms should promote media literacy. It set out its expectations for how regulated services should give people the right skills, understanding and tools to take control of their online experience, gain digital confidence, and critically engage with the content they see. It will review responses to the consultation and aim to publish its final statement of recommendations in spring 2026. 

Summer 2026

Categorisation and additional duties on categorised services

The UK government’s secondary legislation setting the thresholds that will determine which services are categorised under the Act was subject to legal challenge, which concluded in August 2025. Ofcom has considered the implications of the judgment and has adjusted its plans for the categorisation register and the consultation on the additional duties that will apply to categorised services accordingly. It will now carry out a representations process in early 2026, giving the services it believes meet the threshold conditions an opportunity to comment on its provisional decisions before it finalises the register. Subject to the outcome of this process, it plans to publish the categorisation register and consult on the additional duties that apply to categorised services around July 2026. This consultation will cover duties relating to fraudulent advertising, terms of service, user empowerment, ID verification, news publisher content, journalistic content, and content of democratic importance. It will publish its final policy statements as soon as possible, and by mid-2027 at the latest. Where it can, it will bring forward final statements ahead of the main package to maintain pace. It already plans to publish its final statement on the terms of service guidance in early 2027.

Transparency regime

Ofcom issued final guidance on transparency reporting in July 2025. Within a few weeks of publishing the register of categorised services, Ofcom will issue notices to categorised services requiring them to publish transparency reports. The aim is for the reports to give meaningful insights into safety measures and help users to make informed choices. All categorised services will be required to publish their first reports by summer 2027. Ofcom will then start publishing its own transparency reports based on those platform reports from 2028.

Age assurance statutory report

As required by the Act, Ofcom will publish a report by the end of July 2026 assessing how services have used age assurance and how effective it has been in helping them comply with their duties under the Act. Ofcom recently published a Call for Evidence asking for input on this topic.

Later in 2026

Additional safety measures to improve Ofcom’s codes

In June 2025 Ofcom consulted on a targeted set of additional safety measures designed to make online services safer, building on its Illegal Harms and Protection of Children Codes. These include measures to prevent illegal content going viral, tackle harms at the source, and add more protections for children in relation to livestreams. Ofcom is currently considering consultation responses. It will publish its statement by autumn 2026 and will provide an update with more specific detail on timings once it has considered all the responses.

Content harmful to children statutory report

Ofcom will review and report on the incidence and severity of content harmful to children and publish its findings in October 2026. This will include advice about whether Ofcom thinks that changes to primary priority content and priority content are appropriate. 

Online Information Advisory Committee

The Committee will publish its first statutory report by 1 November 2026, along with updates on individual projects.

January 2027

App stores statutory report

Ofcom will publish a report on the use of app stores by children by January 2027. This will assess the role app stores play in children encountering harmful content and evaluate the use and effectiveness of age assurance by app store providers. This will support the Secretary of State in deciding if app store providers should be brought into scope of the Act. It recently published a Call for Evidence asking for input on this topic. 

Ofcom’s progress report

Ofcom has also published a progress report on its activities so far. Among other things, it has completed its Illegal Harms and Protection of Children codes and its guidance on implementing highly effective age assurance, and started driving compliance with these core requirements. It has also set up its fees, super-complaints, and transparency reporting functions. In recent months, the UK government has announced that it will add to the “priority” offences under the Act, including content encouraging or assisting serious self-harm, cyberflashing, and pornographic content depicting strangulation or suffocation. Ofcom is considering how it can give effect to these changes as soon as possible. In early December, Ofcom will publish a report providing an overview of the industry response since duties under the Act came into force, along with its headline online safety priorities for 2026.

Government reaction

SCL readers may be wondering why this is taking so long when the Act received Royal Assent back in 2023. The government thinks the same and has written to Ofcom expressing its “deep disappointment in the delays to the overall implementation of additional duties on categorised services that have been set out in Ofcom’s roadmap…we should not be willing to accept delay: the Online Safety Act was a long time coming and people across the country have been waiting too long for the protections it brings”, as well as its concern that “delays in implementing duties, such as user empowerment, could hinder our work to protect women and girls from harmful content and protect users from antisemitism”. It also says that Ofcom should do everything possible under the Act to tackle antisemitic content and hate speech online.