Ofcom publishes report on video-sharing platforms’ terms of use

August 10, 2023

In November 2020, Ofcom was appointed as the regulator for video-sharing platforms (VSPs) established in the UK. The Communications Act 2003 lists measures that VSP providers must take to protect users from relevant harmful material and under-18s from restricted material.

Ofcom has now published a report on video-sharing platforms which considers how easy it is for people to access, use and understand the terms and conditions set by six platforms: BitChute, Brand New Tube, OnlyFans, Snapchat, TikTok and Twitch. It also scrutinises how those VSPs explain to users what content is and is not allowed on their platforms and the consequences for breaking the rules, as well as the guidance and training given to staff tasked with moderating content and enforcement.

The report highlights that the terms and conditions of VSPs can take a long time to read and require advanced reading skills to understand. This degree of complexity means that they are unsuitable for many users, including children.

Ofcom looked at the length of the terms and found that at nearly 16,000 words, OnlyFans had the longest terms of service, which it estimates would take its adult users over an hour to read. This was followed by Twitch (27 minutes, 6,678 words), Snapchat (20 minutes, 4,903 words), TikTok (19 minutes, 4,773 words), Brand New Tube (10 minutes, 2,492 words) and BitChute (8 minutes, 2,017 words).
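Ofcom does not state its estimation method in this summary, but the figures quoted are consistent with dividing each word count by an average adult reading speed of roughly 250 words per minute (e.g. 6,678 words ÷ 27 minutes ≈ 247 wpm). A minimal sketch of that arithmetic, assuming the 250 wpm rate:

```python
# Sketch of the reading-time arithmetic behind the figures above.
# The 250 words-per-minute rate is an assumption inferred from the
# quoted numbers, not a rate Ofcom publishes in this summary.
WORDS_PER_MINUTE = 250

def reading_minutes(word_count: int) -> int:
    """Estimate reading time in whole minutes for a given word count."""
    return round(word_count / WORDS_PER_MINUTE)

# Word counts as quoted in the report summary (OnlyFans is "nearly 16,000").
terms_word_counts = {
    "OnlyFans": 16_000,
    "Twitch": 6_678,
    "Snapchat": 4_903,
    "TikTok": 4_773,
    "Brand New Tube": 2_492,
    "BitChute": 2_017,
}

for platform, words in terms_word_counts.items():
    print(f"{platform}: ~{reading_minutes(words)} min")
```

At that rate, OnlyFans' nearly 16,000 words come out at just over an hour, matching Ofcom's estimate.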

Ofcom calculated a “reading ease” score for each platform’s terms of service and said that nearly all of them were “difficult to read and best understood by high-school graduates”. Twitch’s terms were found to be the most difficult to read. TikTok was the only platform with terms of service that were likely to be understood by users without a high school or university education. However, Ofcom said that the reading level required was still higher than that of the youngest users permitted on the platform.
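The report does not specify which readability metric Ofcom used. For illustration only, the widely used Flesch reading-ease formula is one common way such scores are calculated; the counts in the example below are invented, not taken from the report:

```python
def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    """Flesch reading-ease score: higher means easier to read.
    Roughly, 60-70 corresponds to plain English, while scores
    below ~30 are typically best understood by graduates.
    Note: this is a standard published formula, offered here as an
    illustration; Ofcom's exact methodology is not stated."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

# Illustrative (hypothetical) counts for a dense legal document:
score = flesch_reading_ease(words=1000, sentences=40, syllables=1700)
print(round(score, 2))  # a low score, indicating difficult text
```

Long sentences and polysyllabic legal vocabulary both drive the score down, which is why terms of service tend to rate as difficult.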

Ofcom’s report also found that some platforms use “click wrap” agreements, where platforms make acceptance of the terms implicit in the act of signing up. Users are not prompted or encouraged to access the terms and so may agree to them without actually opening or reading them.

The six platforms’ community guidelines – which usually set out the rules for using the service in more user-friendly language – were typically shorter than the terms of service, taking between four and 11 minutes to read. Snapchat had the shortest community guidelines, taking four minutes to read. However, the language used meant it had the poorest reading-ease score and would be likely to require a university-level education to understand.

Ofcom also identified several other areas where improvements are needed. In particular, it found that:

  • Users may not fully understand what content is and isn’t allowed on some VSPs. VSPs’ terms and conditions do include rules regarding harmful material that should be restricted for children, but several could be clearer about exceptions to these rules.
  • Users may be unlikely to fully understand what the consequences are of breaking VSPs’ rules. While TikTok and Twitch have dedicated pages providing detailed information on the penalties they impose for breaking their rules, other providers offer users little information on the actions moderators may take. Ofcom also found inconsistencies between what Brand New Tube’s terms and conditions for users say about different types of harmful content, and their internal guidance for moderators.
  • Content moderators do not always have sufficient internal guidance and training on how to enforce their terms and conditions. The quality of internal resources and training for moderators varies significantly between VSPs, and few provide specific guidance on what to do in a crisis situation.

The report also highlights many examples of industry good practice. These include:

  • Terms that list a wider range of content that might be considered harmful to children: TikTok’s, Snapchat’s and Twitch’s terms cover a broad range of different types of content that may cause harm to children.
  • Terms that explain to users what happens when rules are broken: Twitch and TikTok both have external pages containing detailed information about their penalties, enforcement and banning policies.
  • Testing the effectiveness of guidance for moderators: policy changes at TikTok are tested in a simulated environment, and Snapchat analyses moderators’ performance to test the effectiveness of internal policies and guidance.

Ofcom says that it will continue to work with platforms and points out that its regulation of VSPs is important in informing its broader online safety regulatory approach under the Online Safety Bill, which is expected to receive Royal Assent later this year.

This is the first report in a series that Ofcom intends to publish during 2023, which will include a report on VSPs’ approach to protecting children from harm.