Online harms - the good, the bad and the unclear

David Barker thinks the Government's initial consultation response to the Online Harms White Paper shows progress but still needs some work

When the government published its online harms white paper (OHWP) last spring, Pinsent Masons took the view that the ideas in the white paper should represent the start of a journey towards greater regulation, rather than that journey's final destination: many technology companies are ready for greater regulation, but a rushed or ill-judged approach could create more problems than it solves.

The government has now published an 'initial consultation response' - a combination of a report and findings from the responses received to the original OHWP consultation and indications as to how it will take the process forward. The government describes the response paper as an “iterative part of the policy development process”. This is a helpful recognition of the extent of further work required to ensure that regulation of online harms is targeted, proportionate and workable. To borrow and bend a phrase, the response paper feels like a combination of “the good, the bad and the unclear”.

The good

Let’s put our best foot forward and highlight some positives.

First, it’s notable that some 2,400 responses were received to the OHWP consultation. These came from large tech businesses, SMEs, academics, think tanks, children’s charities, rights groups, publishers, governmental organisations and individuals. Almost 400 of the responses were from organisations, the rest from private individuals. The volume and range of responses is a good illustration of the importance of the topic and should bode well for ensuring a balanced legislative approach when that time comes.

Next, it’s striking that protecting freedom of expression is given a high level of prominence in the response paper. It is the very first topic in the government’s response section, and the phrase “freedom of expression” appears no fewer than 47 times in the response paper, compared with just nine in the original OHWP. Of course, analysing the response paper by word count alone has its limitations, but the prominence of freedom of expression bodes well for a more balanced approach when it comes to the nitty-gritty of drafting legislation.

The response paper is less clear on the detail of how freedom of expression will be protected. However, one important and reassuring point which does emerge is the explicit acknowledgement that different approaches need to be taken to content which is unlawful and content which is lawful but potentially harmful. The obvious chilling effect of treating these two fundamentally different types of content in the same way was highlighted with considerable force in responses to the OHWP and it looks like common sense is now getting the upper hand.

It ought to go without saying that speech which is unlawful and speech which may be perceived as harmful by some are not even close to being the same thing. For an apt and very current illustration of the serious difficulties which arise when the boundary between the two is not recognised, it is worth taking time to read the judgment of Mr Justice Julian Knowles in Miller v The College of Policing & Another. In his first paragraph the judge quotes George Orwell: “If liberty means anything at all, it means the right to tell people what they do not want to hear.” Legislators could do worse than to treat this judgment as a useful case study on the risks of apparently well-intentioned over-reaching.

The next piece of welcome news is that the focus of the regulatory regime will be on systems and processes at an overarching level, rather than on more granular instances of non-compliance. There is explicit confirmation, for example, that the new regulatory framework will not require the removal of specific pieces of legal content. This recognises that technology providers have to deal with challenges at a very significant scale. By avoiding the temptation to focus on individual, fact-specific situations, technology providers and the regulator will have a greater prospect of tackling big-picture problems. It follows that the regulatory regime will not provide a redress mechanism for individual complainants in relation to their particular circumstances - more on this below.

Also welcome is the confirmation that the regulator’s decisions will be subject to a statutory appeals process through the courts, in much the same way as the findings of the UK Information Commissioner's Office are subject to appeal to the First-tier Tribunal (Information Rights) and Ofcom’s decisions, within its current remit, can be challenged in the Competition Appeal Tribunal. The alternative approach mooted in the OHWP was that the only mode of challenge might be judicial review. This was an unsatisfactory suggestion, given the high thresholds for successfully pursuing judicial review proceedings. The availability of an appeals procedure should give regulated entities more confidence in the regulator’s approach, knowing that it can be held to account if necessary.

Another positive point to highlight is the emphasis in the response paper on transparency. The whole area of online harms is susceptible to judgements being made on the basis of incomplete information - child sexual exploitation and abuse, and terrorism, are examples. It would be helpful for the public to have a better idea of the lengths to which technology providers currently go in areas like these. The government has committed to producing its own transparency report in the next few months.

The bad

It’s not all good news. Prior to the publication of the OHWP last spring the idea that a 'duty of care' should be the new regulatory standard had gained quite a lot of traction. The Telegraph had campaigned hard for this, and the Carnegie UK Trust had championed this idea before the House of Lords Communications Committee. In the response paper there is a sense that the government may be starting to recognise problems with the idea that a duty of care should define the standard required of regulated entities.

The response paper states that the duty of care will “only apply to companies that facilitate the sharing of user generated content, for example through comments, forums or video sharing”. This is a step in the right direction, but one hopes that later in the legislative process the idea of a duty of care will be put to bed. Lawyers and lawmakers know that in our jurisdiction the concept of a 'duty of care' has developed incrementally through case law, with some ebbs and flows, but with an overarching logical sense of legal narrative. In the limited scenarios in which duties of care have been established by statute, that has happened in order to fill a vacuum, for example in relation to occupiers’ liability. Applying this badge as a kind of panacea to online harms frankly smacks of crowd-pleasing rather than careful, legally-informed thinking.

If anything, other indications in the response paper only serve to highlight how problematic continuing with the duty of care label will be. As already indicated, the response paper is clear that the regulator will not investigate or adjudicate on individual complaints. As a result, we currently face the prospect of a regulatory framework which will have the badge of a duty of care but which will leave individuals distinctly confused about what that duty of care means for them in terms of claiming redress. They will not be able to pursue a complaint to the regulator about their individual circumstances. They may be able to bring a claim through ordinary legal proceedings concerning what has happened to them, but that claim will need to be based on the existing framework of their legal rights. They will not be able to contend that they are entitled to remedies based on the new duty of care because the new regime will not establish a new private right of action in tort. How are judges supposed to navigate all of this?

Another confusing aspect of the response paper is the continuing lack of clarity around in-scope services. The scope originally proposed in the OHWP was that the legislation would “apply to companies that allow users to share or discover user-generated content, or interact with each other online.” The breadth of this was quite alarming. In the response paper the government said: “The legislation will only apply to companies that provide services or use functionality on their websites which facilitate the sharing of user generated content or user interactions, for example through comments, forums or video sharing.” However, the definition remains broad.

The government has assessed that "only a very small proportion of UK businesses (estimated to account to less than 5%) fit within that definition". If this was meant to be greeted as good news then it may well miss the mark. Assuming that “less than 5%” should be read as “something approaching 5%”, this still represents a very substantial number of UK businesses.

Getting the scope clear will be one of the biggest challenges for the government as it moves towards drafting legislation. For example, take the term 'website'. A common understanding of this term might be that it refers to “a page on the internet”. That is quite an outdated notion. Where do apps, chat rooms and chat functionality in games fit in, for example?

Connected with this is the fundamental problem around private communications. The OHWP and the response paper both make the point that many illegal or harmful communications take place in private spaces. Most respondents to the OHWP consultation said that private communications should not be within scope of regulation at all. There is a strong sense that the government is not happy with this, but does not have a clear plan for addressing the very difficult question of how private communications could be effectively brought within scope in a way which is lawful. Our current best guess is that there may be some lighter touch regulation around providing appropriate functionality for users to protect themselves in a private environment. Clearly, though, there is a lot still to do in this area.

The unclear

Much remains unclear and there is hard work to do to make progress on the more difficult aspects of the online harms agenda.

It looks almost certain that Ofcom will be appointed as the regulator for online harms. Ofcom appears to have been the clear preference among those respondents who expressed a preference, and it is understandable that the government is reluctant to establish a brand new regulatory body in circumstances where the regulator’s task will not be straightforward. What is unclear, however, is just how effective Ofcom will be. It will be tempting for Ofcom to play to its strengths and to shape its role based on the strategies and tactics which it has learned from its current remit, especially in the context of broadcasting. That will only work to some extent, though, and Ofcom is likely to need to bring in new expertise and skills, particularly on the technology side, if it is to have credibility in its role.

The possibility of directors being held personally responsible for corporate failures to meet regulatory requirements was raised in the OHWP. The response paper is relatively light-touch on this, and states that responses are still under consideration and that the government will set out its policy position in the spring. A very careful approach will be needed here. Existing regimes involving personal responsibility of directors are generally based on the director having some involvement in the wrongdoing. It is not clear how this standard approach can or will be adopted in the context of online harms. With so much to grapple with already, it may be better to leave this challenge to one side for now, and to return to it later if the regulatory regime proves ineffective for whatever reason.

There are also some very complex issues still to work through concerning children and age-verification. Last autumn the government shelved plans to introduce age-verification systems for access to online pornography under the Digital Economy Act. Attempts to introduce a system using credit card verification had proved controversial and problematic, and the government also failed to give appropriate notice to the European Commission of its plans under the Technical Standards and Regulations Directive.

Last autumn’s announcement said that age-verification would not be introduced under the Digital Economy Act, and that this would be taken up as part of online harms instead, as part of age-verification in the context of access for children to a wider range of services. There is a sense in which a lot of difficult things are being put in the same bucket here, and the protection of children online is perhaps the most difficult of all. In this context it will be important for legislators to think about the way in which children live now, and the way in which technology is an absolutely integral part of their lives. Research commissioned by the UK Safer Internet Centre shows that the internet plays an important role in young people's development and in forming their identity offline.

Finally, the OHWP itself said that the “new regulatory framework will increase the responsibility of online services in a way that is compatible with the EU’s E-Commerce Directive, which limits their liability for illegal content until they have knowledge of its existence, and have failed to remove it from their services in good time”. The response paper is notably silent on this point. Perhaps the UK’s departure from the EU made it off message to refer to the E-Commerce Directive safe harbours, and the Regulations bringing them into force in UK law. Ensuring that legislation takes proper account of the law on safe harbours will be critical. This issue cannot simply be ducked.

Overall, the response paper seems more grounded than the original OHWP. There is much less of a sense that big tech can simply be blamed for, and expected to fix, everything.

The most striking example of that type of thinking in the original OHWP was the example of content illegally uploaded from prisons - search for “Box 3” in the OHWP. Quite how this made it into the published version is baffling. The root of this problem is in prison perimeter security, or the lack of it, which means that devices are easily smuggled in. To blame big tech for this is wide of the mark. In the response paper there is a more pragmatic and informed approach, no doubt as a result of the detailed consultation process. The response paper represents a positive step forward on a number of fronts, but still leaves a great deal of hard work to do, and the government will continue to need much support from external stakeholders.

Timings for legislation are unclear, but there are a number of interim steps which might continue to inform policy development. In addition to its own transparency report which is expected in the next few months, the government also plans to introduce interim codes of practice in some areas.


David Barker is a partner at Pinsent Masons and has more than 20 years' experience dealing with complex disputes. He leads Pinsent Masons' TMT Disputes Team. David was named in The Lawyer's "Hot 100" for 2019 for his ground-breaking work in data privacy litigation.

This article was first posted on the Pinsent Masons blog and is reproduced with permission.

Published: 2020-04-06T13:00:00
