Online Harms White Paper – SCL Consultation Response

July 18, 2019

Who are the SCL?

The Society for Computers and Law is a registered educational charity with over 1,500 members extending across the globe. Originally established in 1973 to promote the use and understanding of Information Technology in the context of law, its membership now represents all areas of IT law and includes those working in private practice, in-house, in chambers and academia, from students and trainees to senior partners, consultants, QCs and judges.

Our mission is to inform and educate legal and technology professionals, academics and students and the wider audience on the impact of IT on law and legal practice through the promotion of best practice, thought leadership, and the fostering of a global tech law community.

SCL therefore feels that it is uniquely well-placed to offer a response to the Government’s White Paper.

Summary of SCL’s response

SCL’s response can be summed up as follows:

If there is to be a new regulator of online content, then there needs to be both clarity and certainty. 

  • Clarity of expectations and definition of what can/cannot be classified as an online harm. 
  • Certainty in how the law and interventions of the regulator are to be applied. 

Fundamental rights to conduct a business, human rights such as freedom of expression, privacy, and informational autonomy must be key to any new law and/or powers of any regulator, and/or the resultant code of practice.

Ultimately, common sense and reasonable measure should be applied. Providers of online content should not be burdened with content moderation that in effect becomes online surveillance through commercial entities. Users of online content should expect to be able to communicate and express views without limitation, except to the extent that it is unlawful, and without fear of disproportionate interventions in private communications. Users should be able to obtain appropriate redress simply and easily, without the process being overly burdensome. The business of providers of online content should only be disrupted where there is sufficiently serious cause.

Q1. This Government has committed to annual transparency reporting. Beyond the measures set out in this White Paper, should the Government do more to build a culture of transparency, trust and accountability across industry and, if so, what?



Dealing first with transparency, given the proposed breadth of application of the reporting regime, and the variation in the type of business carried on by potentially regulated companies, considerable variation in reporting practice seems likely. We therefore suggest that the Government considers introducing a standardised reporting format.

Standardised reporting, published under an Open Government Licence (v3) with browsable reports and summaries (similar to Google Transparency Reports), could make it easier for Government and the third sector to interrogate the raw data. Reporting on a quarterly basis would promote consistent and timely monitoring of the data gathered.
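By way of sketch only, a standardised, machine-readable transparency report might take a shape along the following lines. The field names, categories and figures below are entirely hypothetical and are not drawn from the White Paper or from any existing reporting scheme:

```python
import json

# Hypothetical standardised transparency report. All field names and
# figures are illustrative only, not proposed or official values.
report = {
    "provider": "Example Platform Ltd",
    "period": "2019-Q2",             # quarterly reporting, as suggested above
    "licence": "OGL-UK-3.0",         # Open Government Licence v3
    "complaints_received": 1200,
    "items_removed": 340,
    "removals_by_category": {
        "harassment": 150,
        "csea": 10,
        "other_unlawful": 180,
    },
    "median_response_hours": 36,
}

# Serialising to JSON keeps the report both browsable and easy for
# Government and the third sector to interrogate programmatically.
print(json.dumps(report, indent=2))
```

A common structure of this kind would let reports from very different services be aggregated and compared, which free-form reporting would not.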


Turning to accountability, we suggest publishing details of enforcement action taken by the regulator or jointly by the regulator and another public body. Published information could include the type of service provider, type of content, action taken by/on behalf of the regulator, reason for removal/action, and response.

Accountability would further be supported by an ownership register for websites, which should include beneficial ownership and directing minds. The register need not be publicly available but should be available for investigations and enforcement action. There must be no anonymity within the regulatory regime. Standards akin to anti-money-laundering KYC, and full knowledge of who is doing what on the Internet across borders, are essential.


Trust could be enhanced by supplementing the proposed code of practice with ethical duties, like the Bar Standards Board Code of Conduct, and/or by encouraging companies to adopt and publish an ethical code of practice. This would need to be complemented by authoritative and transparent governance to oversee the operation of the code, for example through ethical advisory boards.

The Government should also promote the use of age-appropriate ratings or age-based user restrictions online.

Q2. Should designated bodies be able to bring ‘super complaints’ to the regulator in specific and clearly evidenced circumstances? If your answer is ‘yes’, please tell us in what circumstances this should happen?


The super complaints process could serve the following useful functions:

  • Accelerating urgent suspected breaches, for example, an urgent threat to life or of serious harm.
  • Reporting multiple and widespread suspected breaches, for example, widespread bullying or abuse of a class or group. This would support the Government’s transparency objective by enabling group complaints against powerful companies.

The process could also furnish the regulator with enhanced investigatory or enforcement powers, which may be helpful in cases of serious or persistent breaches. For example, an interim suspension of the website while the complaint is investigated may be onerous in the case of ordinary complaints but appropriate to the super complaint process.

Q3. What, if any, other measures should the Government consider for users who wish to raise concerns about specific pieces of harmful content or activity, and/or breaches of the duty of care?

Given the potentially high volume of complaints, the Government could consider an alternative mode of dispute resolution. This might include:

  • Clear signposting to ensure that the complaint is brought to the correct place. With the addition of a new regulator, it may be difficult in some circumstances to discern whether a complaint should go to advertising standards, the CMA, the Information Commissioner or a new regulator.
  • Clarity over whether the duty of care is civil or criminal, and the remedies available.
  • Availability of equitable remedies, not merely damages.
  • Opt-out collective action procedures, or the ability to bring actions for remedies other than damages based on data about the number of people affected rather than the number of claimants.
  • A mandatory grievance resolution procedure (e.g. withdrawal of content, apology, identification of the user who generated the content – the latter point would require further consideration from a GDPR perspective).
  • Different procedures for interim remedies, for example, an urgent ex parte assessment of whether to remove content, suspend websites, or grant other injunctions in every case; and abolishing the general requirement to give a cross-undertaking in damages when seeking an interim injunction.
  • Clear rules about vexatious claims.
  • A dedicated court process modelled on IPEC small claims track, and/or an arbitration procedure like the ICANN Uniform Domain Name Dispute Resolution Policy (UDRP) where parties are not required to be physically present.

In order for the above to be effective, the Government should oblige companies subject to the regulatory regime to designate a place of business within the UK to establish jurisdiction for civil and criminal investigation and enforcement.

Q4. What role should Parliament play in scrutinising the work of the regulator, including the development of codes of practice?

The regulator should be an agency of a Government department, and therefore subject to scrutiny by a Select Committee. The Committee should receive regular reports from the regulator, and jointly with the responsible Government department, should undertake a review of the regulator and the regime’s efficacy some years after the legislation introducing the regulatory regime is passed.

Any changes to the regime or to the regulator’s role or powers should be implemented by affirmative statutory instrument.

Q5. Are proposals for the online platforms and services in scope of the regulatory framework a suitable basis for an effective and proportionate approach?



Dealing first with efficacy, SCL welcomes the Government’s ambition to establish a comprehensive regulatory regime, but we are concerned that the multi-sectoral approach may undermine the regime’s efficacy by reducing regulation to the lowest common denominator.

Powers for the regulator to intervene in a sufficiently serious case on an interim basis while investigations/proceedings are ongoing, would support efficacy. Immediacy of regulatory action is likely to be critical to public confidence, for example, in revenge porn, deep fake and CSEA cases. Careful thought should be given to the threshold of seriousness at which these powers become available, and the ability of the company to challenge or appeal interim measures.

We do not understand why the proposed regime’s scope is limited to companies. Any person, natural or legal (whether a company, a partnership, a private individual, an unincorporated association or any other person), must be subject to the regime if it is to be effective.

Turning to proportionality, and drawing on our concerns above about the breadth of the multi-sectoral approach, the regime’s focus on ‘harmful’ content lacks definition. It has the potential to infringe civil liberties in a disproportionate manner. The underlying legislation therefore leaves itself open to being construed narrowly by the courts, making it less effective (the lowest common denominator risk) rather than, as the Government intends, a catch-all regulatory regime.

Q6. In developing a definition for private communications, what criteria should be considered?

Since private communication cannot be defined by reference to the platform or channel within which the communication is sent, the Government should adopt a functional definition of private communications, for example, “whether in all the circumstances a reasonable person would justifiably have an expectation of privacy in the relevant communication.”

Given the pace of technological development, we suggest that the Government does not attempt to enumerate relevant circumstances in legislation but leave it instead to the courts. If the Government were to legislate for relevant circumstances/considerations, these should not be exclusive. They could include:

  • What effect will this have on the individuals involved in the communication?
  • What effect will this have on the business involved?
  • Will releasing the information prejudice any legal proceedings or regulatory investigation?
  • Does the communication contain anything that could put an individual at risk of serious harm, or is it of a national security nature?
  • Is the information the kind which one would normally expect to be kept private? For example, a communication between domestic partners about domestic matters?
  • Does the communication provide information about a minor which is not otherwise in the public domain?
  • Does the communication provide information about a protected characteristic and that information is not already in the public domain?

It would be helpful if the Government could consider whether an interest in private communications is capable of being ‘lost’ and, if so, how. For example, if a private conversation or image is reported in national or foreign media, can a right to privacy in that material be asserted to prevent further publication, or does the Government consider that privacy rights are unenforceable “after the horse has bolted”? In our view, an enduring right to privacy in communications and images will be an important tool to help victims of online harm, for example, revenge porn, regain a sense of control and empowerment, thereby helping them come to terms with the incident.

Q7. Which channels or forums that can be considered private should be in scope of the regulatory framework?

Any communications between individuals or private groups regardless of the medium, including forums which are only accessible by registration and invitation, should be considered private.

The Government should also consider extending the scope of the regulatory framework to cover unilateral private communication which would likely cause harm if published to another person, for example, creating a private Pinterest board containing abusive material. Regard should be had to proportionality, however. It may be that – analogously with the rule in Rylands v Fletcher in respect of dangers created on one’s own property – where a person creates a potentially harmful unilateral private communication online, he is strictly liable for any harm caused when that communication is published more widely.

Q7a. What specific requirements might be appropriate to apply to private channels and forums in order to tackle online harms?

The most effective measures would be focused on prevention, and might include:

  • Clarity that the operator of the channel or forum is not to be regarded as a primary publisher.
  • The need to adopt, publish and enforce acceptable use/community guidelines and a content moderation policy.
  • Suspension or termination of the accounts of users found to have repeatedly breached such a policy.
  • A limited time period for the user and/or the platform to remove infringing material or challenge the instruction to remove.
  • Strict liability for a platform’s failure to remove infringing content.
  • Reserve powers for the regulator to intervene and order content to be removed, including the power to enter the premises of the user or platform operator.
  • Non-liability for the forum operator conditional on taking responsive action to unlawful material.
  • Clarity that harmful but lawful material should not be removed unless it clearly violates the policy, supported by reasoned examples and guidance on this kind of content.

Public education will also be an important component of the new regime:

  • Awareness of rights under the new regime.
  • Awareness of how to enforce those rights by complaining/bringing legal action for breach of duty of care.
  • Awareness of users’ responsibility to use services appropriately and to report irresponsible use.

Q8. What further steps could be taken to ensure the regulator will act in a targeted and proportionate manner?

The regulator should work closely with other regulators and agencies which focus on similar issues, such as the ICO, to enhance effectiveness, limit duplication and reduce the burden on tech businesses. This is especially appropriate where there is a multi-sectoral element to a given case. The new regulator should (so far as is possible) help the user not have to “forum-hop” or bounce from regulator to regulator to get the necessary redress.

Q9. What, if any, advice or support could the regulator provide to businesses, particularly start-ups and SMEs, to comply with the regulatory framework?

Model policies and terms would reduce the burden on small companies without in-house legal teams.

Guidance documents providing examples of unlawful content are also likely to be helpful, as are training sessions, webinars and workshops, and a confidential advice hotline (which could be funded by a levy on regulated companies).

The Government may find it helpful to look to regulation of the English Bar as an example, which provides all of the above through the BSB and Bar Council.

Q10. Should an online harms regulator be a new public body or an existing public body? If you think it should be an existing public body, which body or bodies should it be?

If it were formed from an existing public body, the Information Commissioner’s Office might be a suitable home for it.

Q11. A new or existing regulator is intended to be cost neutral: on what basis should any funding contributions from industry be determined?

The regulator should receive Government funding for core costs such as facilities, and for quasi-criminal investigations or inquiries.

Its other activities could be funded partly by industry and partly by monetary penalties levied for regulatory infringements. Industry funding, which would be more predictable than infringement-related funding, could be in the form of a levy on online advertising revenue in the UK or a mandatory annual registration fee for persons subject to the regulatory regime. The fee could be graduated depending on the number of UK users.
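Purely by way of illustration, a graduated registration fee of the kind described above might be computed along the following lines. The bands and amounts below are invented for the sketch and are not proposed figures:

```python
# Illustrative tiered fee schedule: the bands and amounts are invented,
# not proposed figures. Fees rise with the number of UK users.
FEE_BANDS = [
    (10_000, 0),           # small services: no fee
    (1_000_000, 5_000),    # mid-sized services
    (50_000_000, 50_000),  # large services
]
TOP_BAND_FEE = 250_000     # the very largest platforms

def annual_registration_fee(uk_users: int) -> int:
    """Return the hypothetical annual fee for a service with this many UK users."""
    for threshold, fee in FEE_BANDS:
        if uk_users <= threshold:
            return fee
    return TOP_BAND_FEE

print(annual_registration_fee(500))         # 0
print(annual_registration_fee(20_000_000))  # 50000
```

A banded structure of this kind keeps the levy predictable for industry while scaling the contribution to the size of the regulated service.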

Q12. Should the regulator be empowered to:

  • Disrupt business activities
  • Undertake ISP blocking
  • Implement a regime for senior management liability 

Yes to all of the above.

Q12a. What, if any, further powers should be available to the regulator?

The regulator should have the power to bring proceedings (including to apply for injunctions), but it should also be possible for individuals and (crucially) civil society groups to bring proceedings on behalf of complainants, if the regulator fails to take timely action.

  • Rights of inspection and entry.
  • Powers to order document production.
  • Powers to compel attendance at public hearings.
  • Power to impose penalties for regulatory infringement/breach, including “name and shame” publicity, civil monetary penalties, refusal of permission to run a type of website, and criminal references to other relevant authorities.

Further consideration should be given to the territorial reach of the regulator’s powers.

Q13. Should the regulator have the power to require a company based outside the UK and EEA to appoint a nominated representative in the UK or EEA in certain circumstances?

Yes. This would enable easier communication with larger technology companies which are based, for example, in the US.

Q14. In addition to judicial review should there be a statutory mechanism for companies to appeal against a decision of the regulator, as exists in relation to Ofcom under sections 192-196 of the Communications Act 2003?

Yes.


Q14a. If your answer to question 14 is ‘yes’, in what circumstances should companies be able to use this statutory mechanism?

Rather than duplicating the basis on which the regulator’s decision is amenable to judicial review, the statutory mechanism should enable a full merit-based consideration of the claim (not merely review). It should also provide an urgent informal mechanism for companies to challenge interim actions taken by the regulator, such as suspension of a website, which could exist alongside appeal to the courts.

Q14b. If your answer to question 14 is ‘yes’, should the appeal be decided on the basis of the principles that would be applied on an application for judicial review or on the merits of the case?

Please see above.

Q15. What are the greatest opportunities and barriers for (i) innovation and (ii) adoption of safety technologies by UK organisations, and what role should Government play in addressing these?

It would be helpful if the Government were to provide more guidance on online safety, and a more coherent system of signposting where organisations should go to seek assistance.

Q16. What, if any, are the most significant areas in which organisations need practical guidance to build products that are safe by design?

Data privacy and data governance.

Q17. Should the Government be doing more to help people manage their own and their children’s online safety and, if so, what?


The Government should do more to educate children about online safety and responsibility, beginning in primary school, and to raise awareness among the general public (not only parents). There needs to be greater awareness of user rights online and of how to report abuse. This could be supported by requiring internet browser companies to display a homepage with information about online user safety.

It is also important that the regulator is properly resourced. The Government should not seek by these measures to transfer a public protection burden properly borne by the police onto an inadequately resourced regulator funded by industry. Such a course would likely undermine public confidence in internet governance and safety.

Q18. What, if any, role should the regulator have in relation to education and awareness activity?

The regulator should ensure that public education forms part of the mandatory code of conduct for regulated organisations and should provide resources and guidance to support organisations in meeting this requirement under the code.

It would also be possible for the regulator to adopt a consultative role vis-à-vis other Government departments and agencies, for example, supporting and advising on the maintenance of existing information sources, public campaigns and school curriculum development.

If you would like to discuss any aspect of SCL’s response above further, please do not hesitate to contact us.

SCL key contacts regarding this consultation are:

Caroline Gould
Society for Computers and Law
Unit 4.5, Paintworks, Arnos Vale
Bristol BS4 3EH, UK
T: +44 (0)117 904 1242

Patricia Shaw
Society for Computers and Law
c/o Unit 4.5, Paintworks, Arnos Vale
Bristol BS4 3EH, UK
T: +44 (0)117 904 1242