Online Safety Act receives Royal Assent

October 30, 2023

The Online Safety Act 2023 has finally received Royal Assent, after completing the parliamentary process in September.

The Act imposes duties on social media platforms regarding the content they host. These duties include:

  • remove illegal content quickly, or prevent it from appearing in the first place, including content promoting self-harm;
  • prevent children from accessing harmful and age-inappropriate content, including pornographic content, content that promotes, encourages or provides instructions for suicide, self-harm or eating disorders, content depicting or encouraging serious violence, and bullying content;
  • enforce age limits and use age-checking measures on platforms where content harmful to children is published;
  • ensure social media platforms are more transparent about the risks and dangers posed to children on their sites, including by publishing risk assessments; and
  • provide parents and children with clear and accessible ways to report problems online when they do arise.

In addition to protecting children, the Act aims to give adults greater control over what they see online. It provides three layers of protection for internet users, which aim to:

  • make sure illegal content is removed;
  • enforce platforms’ terms and conditions;
  • offer users the option to filter out content, such as online abuse, that they do not want to see.

If social media platforms do not comply with these rules, Ofcom has the power to issue fines of up to £18 million or 10% of their global annual revenue, whichever is higher.

The Act also aims to address violence against women and girls. Through the Act, it will be easier to convict someone who shares intimate images without consent, and new laws will further criminalise the non-consensual sharing of intimate deepfakes.

These changes in the law also make it easier to charge abusers who share intimate images. Offenders found guilty of this base offence will face up to six months in prison, while those who threaten to share such images, or who share them with the intent to cause distress, alarm or humiliation, or to obtain sexual gratification, could face up to two years in prison.

Most of the Act’s provisions come into force in two months’ time.

Ofcom has announced that it will publish guidance and codes of practice on how in-scope companies can comply with their duties, in three phases, as set out in the Act.

Phase one: illegal harms duties

Ofcom will publish draft codes and guidance on these duties for consultation on 9 November 2023, including:

  • analysis of the causes and impacts of online harm, to support services in carrying out their risk assessments;
  • draft guidance on a recommended process for assessing risk;
  • draft codes of practice, setting out what services can do to mitigate the risk of harm; and
  • draft guidelines on Ofcom’s approach to enforcement.

Following consultation, Ofcom plans to publish a statement on its final decision in Autumn 2024. The codes of practice will then be submitted to the Secretary of State for Science, Innovation and Technology and, subject to approval, laid before Parliament.

Phase two: child safety, pornography and the protection of women and girls

Child protection duties will be set out in two parts: guidance on age assurance, which will be relevant to all services in scope of Part 5 of the Online Safety Act, and codes of practice relating to protection of children.

Alongside this, Ofcom plans to consult on:

  • analysis of the causes and impacts of online harm to children; and
  • draft risk assessment guidance focusing on children’s harms.

It expects to publish draft guidance on protecting women and girls by Spring 2025, when it will have finalised its codes of practice on protection of children.

Phase three: transparency, user empowerment, and other duties on categorised services

A small proportion of regulated services will be designated Category 1, 2A or 2B services if they meet certain thresholds set out in secondary legislation to be made by the government. The final stage of implementation focuses on additional requirements that fall only on these categorised services. Those requirements include duties to:

  • produce transparency reports;
  • provide user empowerment tools;
  • operate in line with terms of service;
  • protect certain types of journalistic content; and
  • prevent fraudulent advertising.

Ofcom will issue a call for evidence on these duties in early 2024 and a consultation on draft transparency guidance in mid-2024.

Ofcom must produce a register of categorised services. It will advise the UK government on the thresholds for these categories in early 2024, and the government will then make secondary legislation on categorisation, which is currently expected by Summer 2024. Assuming this is achieved, Ofcom will:

  • publish the register of categorised services by the end of 2024;
  • publish draft proposals regarding the additional duties on these services in early 2025; and
  • issue transparency notices in mid-2025.