The government says: “The online harms regime will improve users’ safety online, build public trust in digital services, support innovation and drive digital and economic growth”.
The UK government has published its long-awaited full response to its Online Harms White Paper.
The government plans to publish an Online Safety Bill and appoint Ofcom as regulator. Online harms will form part of the government’s digital strategy, including setting up a new Digital Markets Unit at the CMA.
Scope of new rules
The new regulatory framework will apply to companies whose services: host user-generated content which can be accessed by users in the UK; and/or facilitate public or private online interaction between service users, one or more of whom is in the UK.
It will also apply to search engines. It will apply to any in-scope company regardless of where it is based in the world. There are various exemptions, including business-to-business services. Services which play a functional role in enabling online activity, such as internet service providers, will also be exempt from the duty of care, although they will have duties to cooperate with Ofcom on business disruption measures. Further exemptions will include services used internally by businesses, and many low-risk businesses with limited functionality (for example retailers who offer only product and service reviews). Finally, the government says that the proposed legislation will include robust protections for journalistic content shared on in-scope services.
What is harm?
The legislation will set out a general definition of harmful content and activity.
Duty of care and the principles of the regulatory framework
Companies in scope will have a duty of care towards their users and will have to prevent the proliferation of illegal content and activity online, and ensure that children who use their services are not exposed to harmful content. To meet the duty of care, they will need to understand the risk of harm to individuals on their services and put in place appropriate systems and processes to improve user safety. Ofcom will oversee and enforce companies’ compliance with the duty of care, acting in line with a set of guiding principles, including improving user safety, protecting children and ensuring proportionality.
The regulatory framework will establish differentiated expectations on companies in scope with regard to different categories of content and activity on their services: that which is illegal; that which is harmful to children; and that which is legal when accessed by adults but which may be harmful to them. The new regulatory framework will take a tiered approach. Most services will be ‘Category 2 services’. These companies will need to take proportionate steps to address relevant illegal content and activity, and to protect children. A small group of high-risk, high-reach services will be designated as ‘Category 1 services’, and they will need to deal with content or activity on their services which is legal but harmful to adults.
The regulatory framework will apply to public communication channels and services where users expect a greater degree of privacy – for example, online instant messaging services and closed social media groups. Relevant measures might include making services safer by design, such as limiting the ability for anonymous adults to contact children. Companies in scope will need to consider the impact on users’ privacy and ensure users understand how company systems and processes affect user privacy.
Ofcom will issue codes of practice which outline the systems and processes that companies need to adopt to fulfil their duty of care. Companies will need to comply with the codes, or be able to demonstrate that an alternative approach is equally effective. The government will set objectives for the codes in legislation. Ofcom will have a duty to consult on the codes, and must help all companies to understand and fulfil their responsibilities. Ofcom must also publish an economic impact assessment for each code and will have a specific duty to assess the impact of its proposals on small and micro businesses, to avoid undue regulatory burdens.
Interim codes of practice
The government has published interim codes of practice. These provide guidance to help companies mitigate the risks from online terrorist content and activity and from child sexual exploitation and abuse. The interim codes and all the principles contained within them are voluntary and non-binding. Companies should consider factors such as the nature of their services, the underlying architecture of their systems, the risks to their users, and the availability of established or emerging technologies appropriate for addressing the issues identified.
Additional duties on companies
All companies in scope will have a number of additional duties beyond the core duty of care. These include providing mechanisms to allow users to report harmful content or activity and to appeal the takedown of their content.
Disinformation and misinformation
Disinformation and misinformation that could cause significant harm to an individual will be within scope of the duty of care. In addition to the requirements under the duty of care, the legislation will introduce further provisions to address the evolving threat of disinformation and misinformation. This will include specific transparency requirements and the establishment of an expert working group, targeted at building understanding and driving action to tackle these issues.
Regulation and enforcement
As mentioned above, Ofcom will become the regulator. It will cover the costs of running the regime through industry fees: companies above a threshold based on global annual revenue will be required to notify Ofcom and pay fees. Ofcom’s primary duty will be to improve the safety of users of online services (and that of non-users who may be directly affected by others’ use of them). It will have “robust” enforcement tools to tackle non-compliance, including the power to issue fines of up to £18 million or 10% of global annual turnover, whichever is the higher. The government will also consider criminal measures.
The government plans to develop a safety by design framework that will provide guidance for industry on how to build safer online products and services from the outset.
The Online Safety Bill will be published in 2021. The government also expects the Law Commission to produce recommendations concerning the reform of the criminal offences relating to harmful online communications in early 2021. The government will consider, where appropriate, implementing the Law Commission’s final recommendations through the Online Safety Bill.