Ofcom calls on tech firms to start preparing for regulation now

July 6, 2022

As SCL members will be aware, the Online Safety Bill will introduce rules for sites and apps such as social media platforms, search engines and messaging services, as well as other services that people use to share content online. Ofcom is the designated regulator and has published its plans for implementing the Bill. It expects the Bill to pass by early 2023 at the latest, with its powers coming into force two months later.

Ofcom intends to prioritise certain areas in the first 100 days after the Bill comes into force: protecting users from harms arising from illegal content, including child sexual exploitation and abuse, and terrorist content. It will set out:

  • a draft Code of Practice on illegal content harms, explaining how services can comply with their duties to prevent such harms; and
  • draft guidance on how it expects services to assess the risk of individuals coming across illegal content on their services and associated harms.

Ofcom has also launched a call for evidence on these areas: the risk of harm from illegal content; the tools available to services to manage this risk; child access assessments; and transparency requirements. It seeks evidence to strengthen its understanding of the range of approaches and techniques platforms can employ to help them meet their proposed duties under the Online Safety Bill.

To help companies identify and understand the risks their users may face, Ofcom will also publish a sector-wide risk assessment. This will include risk profiles for the different kinds of services that fall within scope of the regime. It will also consult on its draft enforcement guidelines, transparency reporting and record-keeping guidance, which it plans to finalise in spring 2024. Within three months, companies must have completed their risk assessments relating to illegal content and be ready to comply with their duties in this area from mid-2024, once the Code of Practice has been laid before Parliament. Ofcom says it will amend its plans if the timing or substance of the Bill changes.

Ofcom will also identify high-risk services for closer supervision. These services must be ready, as soon as Ofcom’s first set of powers comes into force in early 2023, to explain their existing safety systems to Ofcom and how they plan to develop them.

Some elements of the online safety regime depend on secondary legislation – for example, the definitions of priority content that is harmful to children and of priority content that is legal but harmful to adults. Ofcom’s duties in these areas will come into effect later, and timings will be subject to change depending on when the secondary legislation passes. Ofcom will consult on draft Codes of Practice and guidance for these areas shortly after that legislation passes.

It says that it will build on work already underway by:

  • increasing its engagement with tech firms of all sizes;
  • publishing its first report on how video-sharing platforms such as TikTok, Snapchat, Twitch and OnlyFans are working to tackle harm;
  • undertaking and publishing research on the drivers and prevalence of some of the most serious online harms in scope of the Bill, as well as technical research on how these might be mitigated;
  • further developing its skills and operational capabilities; and
  • continuing to work with other regulators through the Digital Regulation Cooperation Forum to help ensure a joined-up approach between online safety and other regimes.

It points out that this is novel regulation and so it is also important to understand what the Online Safety Bill does – and does not – require. The focus of the Bill is not on Ofcom moderating individual pieces of content, but on the tech companies assessing risks of harm to their users and putting in place systems and processes to keep them safer online.

Ofcom will have powers to demand information from tech companies on how they deal with harms and to take enforcement action when they fail to comply with their duties.

Ofcom has also pointed out that:

  • Ofcom will not censor online content. The Bill does not give Ofcom powers to moderate or respond to individuals’ complaints about individual pieces of content.
  • Tech firms must minimise harm, within reason. Ofcom will examine whether companies are doing enough to protect their users from illegal content and content that is harmful to children, while recognising that no service in which users freely communicate and share content can be entirely risk-free.
  • Services can host content that is legal but harmful to adults, provided their terms are clear. Under the Bill, services with the highest reach – known as ‘Category 1 services’ – must assess the risks associated with certain types of legal content that may be harmful to adults. They must have clear terms of service or community guidelines explaining how they handle such content, and must apply these consistently. They must also provide tools that empower users to reduce their likelihood of encountering this content. However, they will not be required to block or remove legal content unless they choose to.

Separately, it has been reported in some media outlets that Ofcom is likely to receive more powers to deal with end-to-end encryption to prevent images of child sexual abuse being circulated online. The Home Secretary has said that tech giants will not be allowed to use end-to-end encryption for messages and sharing content as an excuse for failing to protect children from abuse.