Ofcom consults on guidance for video-sharing platform providers on measures to protect users from harmful material

March 25, 2021

Video-sharing platforms (VSPs) are widely used by a broad range of UK internet users, with 97% of the UK online population accessing them in the last year. However, according to Ofcom research, one in seven VSP users say they were exposed to something potentially harmful in the last three months. Many users (60%) remain unaware of ways to stay safe on these platforms.

Ofcom has been given new powers to regulate UK-established VSPs. VSP regulation sets out to protect users of VSP services from specific types of harmful material in videos. This includes protecting under-18s from potentially harmful material, and protecting all users from material inciting violence or hatred and from content constituting criminal offences relating to terrorism, child sexual abuse material, and racism and xenophobia. VSPs are also required to ensure certain advertising standards are met.

The statutory framework in the Communications Act 2003 (as amended by the regulations implementing the revised Audiovisual Media Services Directive) sets out a list of measures which providers must consider taking, as appropriate, to secure the required protections. These new requirements came into force on 1 November 2020.

The new framework requires providers to take appropriate measures to protect users from harmful material in videos. Schedule 15A of the Act sets out a list of the measures providers could take. Ofcom is required to draw up and consult on guidance for providers of video-sharing platforms concerning the measures in Schedule 15A which may be appropriate to protect users from the specified categories of harmful material, and the implementation of such measures. The measures are summarised as follows: 

  • measures relating to terms and conditions;
  • measures relating to the reporting, flagging or rating of content;
  • access control measures such as age assurance and parental controls;
  • complaints processes (including the requirement to provide for an impartial procedure for the resolution of disputes); and
  • media literacy tools and information.

Ofcom is consulting on draft guidance for providers on the regulatory requirements. This covers the measures set out in the statutory framework which may be appropriate for protecting users from harmful material, and how these might be implemented. It includes, among other things:

  • Having, and enforcing, terms and conditions for harmful material;
  • Having, and effectively implementing, flagging and reporting mechanisms; and
  • Applying appropriate age assurance measures to protect under-18s, including age verification for pornography.

The draft guidance does not cover the determination of scope and jurisdiction. Ofcom consulted in November 2020 on separate guidance to help providers understand whether they fall within the scope of the definition of a VSP under the Act, including whether they fall within UK jurisdiction, and the process and information required for notifying services to Ofcom. The final guidance on that point was published on 10 March 2021.

In addition, the guidance does not cover advertising-specific requirements around transparency, prohibited and restricted products, or other general advertising requirements, nor the measures that directly relate to them. Ofcom will consult separately on these advertising-specific requirements, including proposals for guidance on the control of advertising and a proposal to designate VSP advertising functions to the ASA.

The consultation ends on 2 June 2021.