Is Europe Moving Away from Protecting Online Platforms?

October 8, 2017

The extent to which online platforms should be responsible
for limiting the availability of unlawful content has been a topic of recent
debate in politics and the media. Should new obligations be put on online
platforms and other internet intermediaries to try to limit the availability of
unlawful content online, and if so what should those obligations look like?

Setting the scene

First, a brief recap of the current regime in Europe.
Articles 12 to 14 of the E-commerce Directive contain protection from liability
for those acting as ‘mere conduits’, and those who are caching, or performing
hosting services. The most relevant for the present debate is the Article 14
hosting defence. This shields information society service providers (such as
ISPs, platforms, social media, etc.) from liability for content stored at the
request of a user of the service as long as they do not have actual knowledge
of the illegal activity/information and are not aware of facts and
circumstances from which the illegal activity/information is apparent. If the
provider obtains such knowledge/awareness they are still protected as long as
they act ‘expeditiously’ to remove/disable access to the information (notice
and take down).

This goes hand in hand with Article 15 of the E-commerce
Directive, which prohibits general obligations being imposed on providers to
monitor the information transmitted/stored, or actively to seek facts or
circumstances indicating illegal activity. It is Article 15 which has cropped
up most in the controversy surrounding the Commission’s proposed new Copyright
Directive, published in September last year (on which, see below).

It is worth recapping here that the hosting defence covers
not just technical storage providers, but can also apply to sophisticated
platforms doing more than mere storage.

So is Europe moving
away from the current regime?

There are two reasons people are asking this question: case
law and recent EU legislative proposals and policy documents.

Case law

In 2015 the European Court of Human Rights gave a decision
in the Delfi case, which some
commentators took to undermine intermediary liability protection. In Delfi the operator of a news portal was
found liable by an Estonian court for reader comments posted under an article.
By the time the case got to the ECtHR, the issue was whether the operator’s
freedom of expression rights had been violated by such a finding. However, the
ECtHR did not examine the correctness of the Estonian court’s original finding
that the operator did not benefit from the hosting defence. (By contrast, a UK
case (Karim v Newsquest) has held
that the defence was available to a news website in respect of reader comments.)
Accordingly, although the case got some headlines at the time, the European
supra-national court was not itself making a finding that hosting protection
did not apply in such circumstances.    

A similar case came before the ECtHR in 2016: MTE and Index.hu v Hungary. Here, the
Hungarian Courts found a portal liable in respect of reader comments posted to
articles. Various bases were given for finding liability, but the highest
Hungarian court considered the portals were not intermediaries at all. Again,
the ECtHR did not go behind the domestic court’s finding on the hosting
defence, albeit this time it did find that there had been a violation of the
portal’s freedom of expression rights.

More recently, there have been two judgments issued by the Northern
Ireland High Court which drew attention for appearing to erode the hosting
defence. In CG v Facebook Ireland Ltd and J20 v Facebook Ireland Ltd, the
Northern Ireland High Court found Facebook liable for
certain information displayed on user profile pages, as a misuse of private
information. However, the Court of Appeal ('NICA') subsequently overturned
various aspects of the High Court’s finding in CG. Significantly, the NICA disagreed with the argument that the
platform knew or ought to have known about the content without the need to be
notified simply because it was similar to content previously found to be
unlawful. This would have been possible only if Facebook had an obligation to
monitor user content, which is incompatible with Article 15 of the E-commerce
Directive. The Court of Appeal also reduced the level of damages awarded in J20.
It is not yet known to what extent the NICA analysed the hosting defence
arguments in that case, as only a summary of the judgment has been published at
the time of writing.

Proposed Copyright Directive

Following various consultations as part of its Digital
Single Market strategy, in September 2016 the Commission published a draft text
for a proposed new Copyright Directive. Article 13 of the draft Directive is in
part designed to try to address a so-called ‘value gap’, a name used by rights
holders and others for the perceived gap between the royalties generated for
creators/owners of copyright works and the revenue generated by websites making
use of those works. The gap is said to arise in particular due to the widespread
online presence of unlicensed content. Article 13 would oblige service providers
which store and provide access to large amounts of works uploaded by users to:
(1) take measures to ensure the functioning of agreements concluded with rights
holders for the use of their works, and (2) prevent the availability on their
services of works identified by rights holders through cooperation with the
service providers. An example given of such measures is effective content
recognition technology.

Both sides of the debate (rights holder vs intermediary)
have been vociferous in their commentary. Many observers have also pointed out
the lack of clarity around how this regime (specifically, part (2) above) is
supposed to fit with the prohibition on monitoring in Article 15 of the E-commerce
Directive, and other EU legal instruments.

At the time of writing, no consensus has yet been reached
among European legislators about the fate of draft Article
13. To become law, the text would need to be approved by both the European
Parliament and the Council. Of the European Parliament committees tasked to
give Opinions, one declined, one is in progress and three have opined (in
different directions). The lead committee responsible, Legal Affairs (JURI), is
yet to report.

The European Council is also yet to adopt a common position.
Two sets of questions have been submitted to the Council’s in-house lawyers
regarding Article 13, one jointly by six Member States and one by Germany.
These include questions about its compatibility with the EU Charter of
Fundamental Rights, with Article 15 of the E-commerce Directive, and with the
definition of ‘communication to the public’ under the InfoSoc Directive. The Estonian
Presidency has meanwhile also made two alternative compromise proposals on Article
13.

There are suggestions that there will be some delay from the
original timetable, meaning that a plenary Parliamentary vote may not happen
until the end of 2017/early 2018.

New Guidelines

The most recent development has been the publication on 28
September 2017 by the Commission of a Communication about tackling illegal
content online. The thrust of the Communication is apparent from its sub-title
‘Towards an enhanced responsibility of online platforms’. According to its
introduction, the Communication lays down a
set of guidelines and principles for online platforms to ‘step up the fight against
illegal content online’. From an IP perspective, content owners are likely to
welcome the move. But many of these guidelines and principles will generate
controversy (whether there is over-reliance on so-called trusted flaggers;
whether sufficient heed is paid to either variation between laws of different
EU countries about what content is illegal or to context, particularly when
considering automated filtering technology). For present purposes, I will focus
on proactive measures by online platforms.

According to the Commission, online platforms ‘should’ adopt
effective proactive measures to detect and remove illegal content online. It
considers that this does not automatically lead to the online platform losing
the benefit of the hosting defence. The Commission’s reasoning has two parts.
First, it likens the taking of such measures to the sorts of acts found in L’Oréal v eBay not to amount to playing
an ‘active role’ in respect of the content (storing offers for sale, setting
terms of service, being paid and providing general information to customers).
Second, it recognises that proactive measures ‘may result’ in that platform
obtaining knowledge/awareness leading to a potential loss of the hosting
defence, but notes that protection would not be lost if the platform takes
expeditious action to remove/disable access. This second part could be seen as
undercutting the reassurance offered by the first, since it effectively
narrows the question of liability to whether removal was expeditious in such
cases (assuming the content concerned is in fact illegal). Platforms will also
point out that there is no real discussion of how this fits with the Article 15
prohibition on general monitoring duties. Taken as a whole, the position places
considerably greater responsibility on platforms.

The Communication is framed as guidance and acknowledges that it does
not change the legal framework or contain legally binding rules. However, it is
a tool for exerting political pressure and is put in strong terms (‘the
Commission expects online platforms to take swift action’). It is described as
a first step. The document tells us that exchanges and dialogues with online
platforms and others will continue and progress will be monitored to assess
whether ‘additional measures’ are needed. This includes the possibility of ‘legislative
measures to complement the existing regulatory framework’.

Conclusion

The combination of proposed Article 13 of the draft
Copyright Directive and the Commission’s latest Communication will lead some to
conclude that Europe is indeed moving away from protecting online platforms. It
certainly appears that the two developments would place a much greater onus on
platforms than is currently the case. A fuller picture will be known in May
2018, when the Commission says the work of ensuring ‘swift and proactive
detection and removal of illegal content online’ will be complete, and the
Copyright Directive will be in final form. But the direction in which European
policy makers are heading is already evident. 

Nick Aries is a
partner in Bird & Bird’s IP Group. Nick advises on copyright, trade mark
and design matters, with a particular focus on IP issues arising out of digital
business models / the media sector.