Coran Darling looks at the recent guidance for video sharing platforms that fills a void left by delays to the Online Safety regime and ponders what comes next
Ofcom has recently published draft guidance for consultation on how video-sharing platforms (VSPs) should behave so as to meet their regulatory requirements. Among other items, the guidance covers potential measures providers may use to protect their users from potentially harmful material. The proposal also considers what should be defined as ‘harmful material’ and which types of media will fall under this category.
This article seeks to briefly summarise the consultation and proposed guidance. In doing so, it clarifies the context surrounding the proposal, what types of services fall within the category of VSPs, what proposals Ofcom has made for these platforms, and the next steps the regulator seeks to take in creating a safer environment to share and distribute content.
Why is Ofcom proposing this guidance?
In late 2020, changes to the Communications Act 2003 (the CA 2003) came into effect, introducing a number of regulatory requirements for UK-based VSPs and tasking Ofcom with ensuring their enforcement.
The main purpose of these amendments was to “protect users […] from harmful content” when engaging with VSPs. To that end, the new regime sets out a range of measures that must be considered when providing video-sharing services to the general public. In particular, providers must put in place suitable measures to protect the general public from material that is likely to incite violence or hatred, and must ensure that measures are in place to protect those under the age of 18 from content that may “impair their physical, mental, or moral development”. The regime also includes several specific requirements to protect users from potential harms arising from advertisements on a platform.
However, how this was to be achieved was left in something of a grey area, presenting Ofcom with an opportunity to clarify the current regulatory position and how it seeks to assist VSPs in achieving compliance.
What are VSPs?
The concept of VSPs largely originates from the provisions of the European ‘Audiovisual Media Services Directive’ (the Directive), which attempted to define more concretely the rapidly growing market of services and platforms offering the ability to host and share media content online. These provisions were transposed into UK law as Part 4B of the CA 2003, thereby bridging a gap in regulation while the Government’s ‘Online Harms regime’ remains unimplemented.
VSPs are online services that allow users to upload, share, and play back user-generated videos with other members of the platform. Well-known examples are the likes of YouTube and Vimeo. Users typically access these services through a mobile app or website, and videos generally remain accessible to users until they are deleted by the uploader or the VSP ceases to host the media. This ease of access and use means VSPs readily serve as platforms for entertainment, learning, and business resources.
A notable defining aspect of VSPs under the Directive is that, unlike curated platforms, they hold no editorial responsibility, nor any obligation to educate, inform, or explain to users details about the content they host. The responsibility for assessing how appropriate the content is rests with users themselves. This understandably raises questions about how users can be protected from harmful online content, as the Government intends, without suitable moderation by the platform provider or a regulator.
What is Ofcom proposing?
The proposal from Ofcom seeks to reflect the distinction in the CA 2003 between advertisements that are controlled by the VSP and those that are not. In other words, it distinguishes between circumstances where a VSP has “specifically marketed, sold, or arranged” the advertisements appearing alongside its platform and content, and circumstances where this is not the case.
Where the VSP does control these advertisements, it is responsible for ensuring compliance with all relevant regulatory requirements. In such circumstances, Ofcom has proposed that regulation of these matters be carried out in tandem with the Advertising Standards Authority (the ASA).
Where the VSP does not control the advertisements, it must nevertheless take the necessary steps to ensure that advertising on its hosted content meets all relevant requirements. In these circumstances, Ofcom has proposed that it will assess and determine whether the steps taken by the VSP have been sufficient to protect its users.
Ofcom has therefore requested consultation responses on five primary areas.
Where do we go from here?
In May of this year, the UK Government published the ‘Draft Online Safety Bill’ (the Online Safety Bill) with the aim of establishing a new regulatory framework capable of tackling harmful content online and achieving its goal of making the UK the safest place in the world to be online. When it enters into force, the Bill will supersede many of the provisions of the CA 2003, and those provisions dedicated to advertising requirements will be repealed.
In publishing the Bill, the UK Government also stated that regulation of VSPs in the context of advertisements would continue to be the responsibility of the ASA and Ofcom. The Ofcom proposals have already factored in the current draft provisions, including the obligation to regulate, and are designed to complement the Bill once it comes into force. In doing so, they seek to foster a collaborative approach that will allow regulators and providers to work together to ensure that advertising standards are upheld in the best interests of the public and of VSP users.
The consultation closes on 28 July 2021, with a summary of Ofcom’s findings due shortly after.
Does it go far enough?
The early stages of the consultation grant Ofcom an invaluable opportunity to take stock of the regulatory landscape, and are an encouraging development towards the creation of a safer online environment for users in the UK. In particular, the guidance does well to distinguish responsibilities for advertisements according to whether control and curation of such media rests with the VSP.
While it is acknowledged that this guidance will be superseded by the Online Safety Bill, should the Bill introduce further amendments before coming into force, the extent to which this will be the case remains unclear. One area lacking clarity is the future scope of which platforms will be defined as VSPs, and whether Ofcom’s cooperation with other digital regulators may result in foreign-based VSPs becoming subject to regulation by virtue of allowing access to UK users.
Answers to many of these areas of uncertainty will become clearer as the Draft Online Safety Bill and similar forms of regulation develop alongside the guidance of regulators such as Ofcom and the ASA. The current proposals therefore represent a promising next step in online regulation but continue to present further opportunity for development as the draft framework becomes legislation.
Until further consultation and development is made, it appears the best course of action may be to sit and wait for the situation to ‘buffer’ while determining in which direction the future of VSP regulations will progress.
Coran Darling is a Trainee Solicitor at DLA Piper LLP with experience in data privacy, artificial intelligence, robotics, and intellectual property, having joined the firm from the technology industry.