Online Pornography, Age Verification and the Digital Economy Bill 2016

July 4, 2016

The regulation — or, at least, attempted regulation — of online pornography, and initiatives under the banner of protecting children online, are nothing new. Over recent years, we have seen legislative interventions (including the regulation of ‘extreme pornography’ under the Criminal Justice and Immigration Act 2008 and the Audiovisual Media Services Regulations 2014), co-regulatory approaches such as the Internet Watch Foundation’s ‘CAIC’ list and self-regulatory measures (albeit underpinned in some cases by not-inconsiderable political pressure) by Internet access providers to implement ‘family friendly’ filters.

Part 3 of the Digital Economy Bill 2016 is, as currently drafted, a further attempt in this area, setting out a framework to require operators of commercial online pornography services to implement age verification.

The general obligation

The main thrust, if I can put it that way, of this part of the Bill is that a person who ‘makes pornographic material available on the internet on a commercial basis to persons in the United Kingdom’ is prohibited from doing so ‘except in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18.’ In other words, the service must have at least reasonably robust age verification mechanisms, or else not provide content to UK users.

The framework is to cover both websites and ‘other means of accessing the internet’ — a slightly nebulous term, which is likely to mean ‘apps’, given that applications are expressly identified in the government’s response to its February 2016 consultation on age verification.

Internet access providers are likely to feel left in an uncertain position at the moment: while the Bill does not reference them in this context, the definition of ‘makes pornographic material available’ could arguably encompass companies which provide connectivity to servers used for making pornographic material available. They may be able to take some solace from paragraph 22 of the Explanatory Notes, which refers to ‘commercial providers of pornography’ and so appears to place the emphasis on content provision, but the optimal approach would be to improve the drafting in this section to make the legislative intent clear.

‘Operating on a commercial basis’

There is no definition within the Bill of ‘on a commercial basis’, although the provision of pornography ‘free of charge’, on a service which is otherwise ‘operated on a commercial basis’, is expressly stated to fall within the rules (clause 15(2)).

This approach would appear to be consistent with that taken by the CJEU in considering the scope of the ‘hosting’ shield for the purposes of Directive 2000/31/EC, in respect of which it has held that a site’s economic context need not relate solely to revenue flowing from the site’s visitors, but can also include revenue from a site’s advertising deals (Papasavvas). Similarly, in the recent decision of McFadden, the CJEU held that making available a free wi-fi service was a ‘service normally provided for remuneration’ where it is an activity performed by the service provider for the purposes of advertising the goods sold or services supplied by that service provider: the Court looked broadly at the purpose of the wi-fi service provision, rather than narrowly at the specific service itself.

A site which provides pornography free of charge and without any third-party advertising revenue may, on such a basis, still be considered to be ‘operating on a commercial basis’ if the site is operated with the intention of ‘upgrading’ users to the operator’s premium, for-charge, content.

Further, a broad degree of discretion is to be afforded to the regulator, which will seemingly be empowered to decide when a site is, or is not, complying with the rules, and whether or not it is operating on a commercial basis (clause 15(3)). While affording a regulator these powers may allow for a nimbleness of movement which a statute may struggle to provide, it also entails giving an appointed body potentially considerable power over a business’s ability to operate.

Definition of ‘pornographic material’

The definition of ‘pornographic material’ within the Bill takes up the best part of half a page; considerably more than the definition used in the Criminal Justice and Immigration Act 2008, which took a little over a line.

The gist of the definition is that ‘pornographic material’ encompasses all video works, or excerpts of video works, which are, or would be, ‘R18’ material, or would have led to an 18 certificate (clause 16).

On first glance, there would appear to be a lacuna here, as the definition would seem to exclude material which would be refused an R18 certificate but which would not fall within the offences of obscenity or extreme pornography — for example, a portrayal of apparently non-consensual activity. As this type of material is likely to be exactly that in respect of which measures of this nature are considered necessary, this would appear to be an unfortunate oversight.

Age verification and proof of identity

The Bill does not specify what would constitute suitable age-verification measures, but the government’s consultation response is clear that a ‘tick-box’ or ‘enter your date of birth’ verification will not be deemed sufficient.

As such, some form of third-party validation or data sharing is likely to be compulsory, and it will be interesting to see whether this can be achieved without requiring users to identify themselves, in terms of providing their real world identity, to site operators.
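Neither the Bill nor the consultation response prescribes how such identity-blind verification might work, but one purely illustrative sketch (entirely hypothetical; the names, the shared-key approach and the token format are all my own assumptions) is a token-based attestation scheme, in which a third-party verifier checks a user's age out of band and issues a signed ‘over 18’ token which a site can validate without ever learning who the user is:

```python
import hashlib
import hmac
import secrets
from typing import Optional

# Hypothetical shared key between an age-verification provider and a site
# operator; a real scheme would more likely use asymmetric signatures.
VERIFIER_KEY = secrets.token_bytes(32)

def issue_token(user_is_over_18: bool) -> Optional[str]:
    """The verifier checks the user's age out of band and, if satisfied,
    issues an opaque attestation containing no identity information."""
    if not user_is_over_18:
        return None
    nonce = secrets.token_hex(16)  # random value, not an identifier
    sig = hmac.new(VERIFIER_KEY, nonce.encode(), hashlib.sha256).hexdigest()
    return f"{nonce}.{sig}"

def site_accepts(token: str) -> bool:
    """The site checks the signature; all it learns is 'over 18'."""
    try:
        nonce, sig = token.split(".")
    except ValueError:
        return False
    expected = hmac.new(VERIFIER_KEY, nonce.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

Whether any real scheme along these lines would satisfy the regulator is, of course, an open question; the point is simply that age verification and identification of the user need not be the same thing.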

If the only mechanism by which a site can demonstrate appropriate age verification would be to require proof of identity, this is likely to have a significant adverse impact on most sites’ business models, as I would be surprised if many visitors would be willing to provide this information. I suspect that many users value their apparent (if not actual) anonymity when visiting a pornography site. If this is indeed the effect, and if the requirement is not imposed and enforced across all providers, domestic and foreign alike, UK-based providers are likely to be at a considerable disadvantage.

Overseen by a new regulator

To oversee this new framework, the government is proposing to create a new regulator, the ‘age-verification regulator’ (clause 17). This new regulator would be imbued with information gathering powers (clause 19), as well as enforcement powers (clauses 20 and 21).

Enforcement powers include both enforcement notices (although these appear to be entirely optional, and not a prerequisite of issuing a financial sanction (clause 20(4)(a))) and financial sanctions. These financial sanctions carry a maximum penalty of the greater of £250,000 or 5% of qualifying turnover (clause 21).
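Taking the clause at face value, the penalty cap is a simple maximum. A minimal sketch of the calculation as I read clause 21 (not a statement of how the regulator would in fact assess a penalty, and ‘qualifying turnover’ has its own statutory definition):

```python
def maximum_penalty(qualifying_turnover: float) -> float:
    """Cap on a financial penalty under clause 21: the greater of
    £250,000 or 5% of qualifying turnover."""
    return max(250_000, 0.05 * qualifying_turnover)

# A provider with £10m qualifying turnover faces a cap of £500,000,
# while a smaller provider with £1m turnover sits at the £250,000 floor.
```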

Enforcement is undertaken on a civil, not criminal basis. Although my gut reaction is that this is disappointing, since it entails a lesser burden of proof and it affords a substantial degree of power to a non-judicial body, it is perhaps to the general benefit of those against which sanctions might be levied to face a merely financial, rather than also criminal, risk.

Payment providers, ISPs and others

There is no power within the Bill to require an Internet access provider to block access to a service which fails to comply with the age-verification obligations. According to the Explanatory Notes, at paragraph 23, such power is not required ‘on the basis that this would not be consistent with the treatment of other harmful or illegal content such as online terrorist material’. The Notes also appear to cite with approval existing, co-regulatory approaches, such as deployment by major ISPs of ‘optional family friendly filters’.

In omitting such a power, it would seem that the government is, perhaps wisely, able to avoid an ever greater debate about censorship.

The Bill does, however, empower the age-verification regulator to provide to payment services providers and ‘ancillary service providers’, such as Internet access providers, a list of sites which it considers fail to have appropriate age verification measures in place. Although there is no legal compulsion to do anything, the clear intention of the measure is to encourage such providers to take voluntary action to update their blocking lists, or deny service. This approach is consistent with that of ‘Operation Creative’, in respect of online material which allegedly infringes copyright, which sees URLs of sites purportedly making available such materials being provided to intermediary service providers, who can voluntarily withdraw their services.

Net neutrality and the blocking of porn sites

The government’s seeming approval of the voluntary ‘family friendly filtering’ approach adopted by many Internet access providers is interesting, particularly against the backdrop of the EU’s open internet rules in Regulation (EU) 2015/2120.

It is questionable whether a self-regulated, non-statutory approach to content blocking is consistent with the regulation’s requirements, and whether the imposition of network-level blocks on categories of content (including pornography) in the absence of a statutory framework leads an Internet access provider to breach Article 3.

Article 3 provides that ‘[e]nd-users shall have the right to access and distribute information and content… irrespective of the … content’, and this is backed up with an obligation on providers of internet access services to ‘treat all traffic equally, … without discrimination, restriction or interference, and irrespective of … the content accessed or distributed.’

Derogations from this obligation are permitted only in certain circumstances, including where such measures are implemented to ‘comply with Union legislative acts, or national legislation that complies with Union law’. Currently, there is no such legislation in respect of pornography, and the other stated derogations are not readily applicable.

Providers operating family filters are likely to argue that the regulation is clearly about giving end-users choice, and that choice as to what they wish to receive must inherently include choice as to what they do not wish to receive, such that a user who asks their provider to implement network-level blocking is merely exercising that choice, with the access provider simply giving it technical effect. However, the idea that a customer might wish to apply network-level blocks to their own connection is not something which appears expressly in the regulation. This may well be an issue left to the courts to resolve, or else one which entails the hasty introduction of legislation.

Overseas effect and general effectiveness?

The elephant in the room with this legislation — and any legislation of a similar character — is the extent to which it will apply to providers based outside the UK and, more broadly, whether it will have much practical effect, particularly without the fall-back mechanism of compulsory site blocking.

Interestingly, the government’s consultation response suggests that at least two pornography companies are in favour of the age-verification scheme, citing ‘MindGeek, the parent company of several major sites including PornHub’ as one supporter. It is unclear, however, that MindGeek’s sites currently implement any form of age verification on a voluntary basis and, with this in mind, it is possible that, in addition to an undoubted genuine concern for the welfare of underage potential viewers who might stumble across their properties, commercial considerations are at play: the resources required to implement age-verification measures may be beyond the reach of smaller providers, and so might confer a commercial advantage on more established operators.

Both companies stressed the need for ISP blocking of non-compliant sites, the absence of which was described as a ‘glaring omission’ which would undermine the current proposal. Conversely, Internet access provider respondents to the consultation were less enamoured of the idea of compulsory blocking, raising concerns about the efficacy of blocking, as well as its ‘complex and legally challenging’ implications.

Of course, at this stage, the proposals are just in bill form, and there is still opportunity for movement. While concerns around the legitimacy and legal complexity of blocking are likely to be entirely justified, it is perhaps understandable that UK-based pornography providers are fearful of the possibility that they will be regulated in one manner — a manner which could well lead to a substantial decrease in traffic — and overseas competitors in another.

Neil Brown is the managing director of decoded:Legal, a specialist English law firm advising Internet, telecoms, technology and healthcare start-ups and businesses.