Vidal-Hall v Google: Can Big Brother Be Defeated?

April 12, 2015

In March, the Court of Appeal handed down a judgment in Vidal-Hall v Google [2015] EWCA Civ 311. Whilst this judgment is only the second in what looks likely to be a series of appeals before the substantive matter reaches trial, it sets clear signposts that could revolutionise the way online giants make their money.

Background

In recent years, targeted advertising has been the be-all and end-all of digital marketing. Everybody is at it, and Google is the largest provider of these services.

Google’s DoubleClick advertising platform, the subject of Vidal-Hall, reportedly holds as much as 77% of the market share, serving adverts on both Google’s own websites and member-owned websites. In summary, this is how it works (a simplified sketch follows the list below):

1. Website owners (known as ‘Publishers’) become ‘network members’ by signing up to Google AdSense or DoubleClick Ad Exchange, which allows them to earn advertising revenue without pursuing advertising clients (‘Advertisers’) directly; and

2. Advertisers acquire advertising space on Google’s own websites, Publishers’ websites, or both, through Google AdWords.
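To make the three-way relationship concrete, here is a minimal, purely illustrative sketch in Python. The class and field names are my own invention, not Google’s API, and the keyword-matching stands in for what is in reality a real-time auction.

```python
# Purely illustrative model of the Publisher / Advertiser / ad-network
# relationship described above. Names and numbers are hypothetical;
# this is not Google's actual API or auction logic.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Advertiser:
    name: str
    keywords: set          # keywords bought through something like AdWords


@dataclass
class Publisher:
    site: str              # network member serving adverts via AdSense/Ad Exchange
    revenue: float = 0.0


class AdNetwork:
    """Stand-in for the platform sitting between Publishers and Advertisers."""

    def __init__(self) -> None:
        self.advertisers = []

    def serve_ad(self, publisher: Publisher, page_keywords: set) -> Optional[str]:
        # Pick any advertiser whose bought keywords match the page being viewed;
        # a real system would run an auction, omitted here for brevity.
        for advertiser in self.advertisers:
            if advertiser.keywords & page_keywords:
                publisher.revenue += 0.01   # Publisher earns a share per impression
                return f"advert from {advertiser.name}"
        return None


network = AdNetwork()
network.advertisers.append(Advertiser("HolidayCo", {"travel", "flights"}))
blog = Publisher("example-travel-blog.com")
print(network.serve_ad(blog, {"travel", "reviews"}))   # advert from HolidayCo
```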

Case Background

The Background Facts

The three claimants were all individuals who owned Apple computers and used the Safari web browser between summer 2011 and 17 February 2012.

Safari blocks third-party tracking cookies by default. Between those dates, Google DoubleClick either knowingly or recklessly used a workaround to circumvent this block, despite Google representing on its website that Safari users need not take any further steps to prevent DoubleClick from tracking their usage.
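For readers unfamiliar with the mechanics, the sketch below is a toy model of the first-party/third-party distinction that Safari’s default setting turns on. It is not Safari’s actual code, nor the workaround Google is alleged to have used; the URLs are invented.

```python
# Toy model of third-party cookie blocking. Purely illustrative:
# not Safari's real implementation, nor Google's workaround.

from urllib.parse import urlparse


def is_third_party(page_url: str, setter_url: str) -> bool:
    """A cookie is 'third party' when it is set by a host other than
    the one shown in the browser's address bar."""
    return urlparse(page_url).hostname != urlparse(setter_url).hostname


def accept_cookie(page_url: str, setter_url: str, block_third_party: bool = True) -> bool:
    """Safari-style default: refuse cookies set by third-party resources."""
    return not (block_third_party and is_third_party(page_url, setter_url))


# An ad network's tracking pixel embedded in a news page tries to set a cookie:
print(accept_cookie("https://news.example", "https://ads.example/pixel"))  # False
# The same cookie set while visiting the ad network's own site is first party:
print(accept_cookie("https://ads.example", "https://ads.example/pixel"))   # True
```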

The claimants alleged that this revealed private information about them to other users of their computers, and to onlookers, such as someone peering over their shoulder at the screen.

They therefore brought a claim for misuse of private information and for breach of the DPA. They claimed damages for anxiety and distress, and aggravated damages on the basis that Google either (i) was not aware of the issue when it should have been, or (ii) was aware of the issue and did nothing about it.

Google disputed jurisdiction, and the High Court upheld the Master’s decision that:

1. There was a tort of misuse of private information;

2. There was also a cause of action under the DPA; and

3. There was a reasonable prospect of success on one or both causes of action.

Google appealed on all three matters.

The Nature of This Judgment

It should be noted that neither this appeal judgment nor the High Court judgment from which it arose is a trial judgment. Both were judgments on a preliminary issue, namely whether the claim form could be served out of the jurisdiction.

Therefore, there is still a long way to go before any closure is obtained on this matter. As the case has not yet reached trial, the court analyses the issues only so far as is necessary to determine whether there is a real prospect of success. The decision may therefore have little bearing on the eventual outcome.

Issue #1: The Tort of Misuse of Private Information

To bring a claim against a non-EU defendant in England & Wales, you must satisfy the common law rules of jurisdiction. In brief, unless the defendant is present in the jurisdiction or submits to it, this requires the court’s permission to serve the claim form out of the jurisdiction (CPR 6.36).

First, you must establish a ‘jurisdictional gateway’ – the claim must be of a particular type (set out in CPR PD 6B 3.1).

Traditionally, misuse of private information has been ‘absorbed’ as a branch of the equitable law of confidence. Equitable actions are not given a gateway in CPR PD 6B 3.1, and breach of confidence has (correctly) not been treated as a tort for these purposes. Torts, by contrast, do have a gateway under CPR PD 6B 3.1(9).

However, this has not been a happy arrangement. Courts have struggled with the clumsiness of fitting breach of privacy into an equitable action that is fundamentally different in character.

Confidentiality and privacy are not miles apart, nor are they synonymous. Some of the key differences, which ultimately require that privacy should be a discrete cause of action in tort, are highlighted below.

Confidence

Breach of confidence has traditionally required a relationship or circumstance that ‘imports an obligation of confidence’, and must relate to information that has the necessary quality of confidence (see Coco v A N Clark (Engineers) Ltd [1969] RPC 41).

It prevents and/or punishes persons for acting unfairly, eg by disclosing sensitive commercial information where there was an explicit or implicit equitable (in the broad sense) obligation. It is by nature an equitable remedy: it prevents one person from taking unfair advantage of another where the circumstances have imported or implied an element of trust.

Case law has developed what equates to ‘information with the necessary quality of confidence’, and, overall, it has developed logically. When you hear the term ‘confidential information’, phrases such as ‘need to know’ and ‘secret’ come to mind. Likewise, the courts have decided that confidential information must, among other things:

· not already be in the public domain (Saltman Engineering Co Ltd v Campbell Engineering Co Ltd (1948) 65 RPC 203);

· have commercial value (Thomas Marshall (Exports) Ltd v Guinle [1979] Ch 227).

Privacy

The idea of privacy, whilst related to some extent, is fundamentally different. Privacy, in its ordinary meaning, relates to personal information, not ‘secret’ information.

Known to the public

Personal information may be known to a section of the public, but still be private. For instance, you may have announced on Facebook, visible only to your close friends, that you are engaged, even though you do not want the world to know. Confidence would arguably be defeated here: the information is in the public domain. Privacy, however, would not. It would be a breach of your privacy if a newspaper found out and splashed the news across its front page.

Value

Private information can have commercial value, but more often it does not. A newspaper’s knowledge of an ordinary person’s alcoholism has no commercial value, whereas knowledge of a celebrity’s addiction does.

Disclosure to the public

In the present case, there is arguably no disclosure. Whilst Google is using your information to enhance its commercial offering, it is not handing this information over directly. Thus, a claim for breach of confidence would, on the face of it, fail.

However, privacy does not require disclosure. Using private information to enhance a commercial offering without consent would be an abuse of your privacy, just as an unwarranted pat-down search at the airport would be.

The Court’s Dilemma

The Court of Appeal faced an analysis similar to the above. Whilst misuse of private information has successfully been pursued through the law of confidence, it has been an uncomfortable ‘shoe-horning’.

The Court would likely have been able to shoe-horn Google’s misuse into the law of confidence, as courts have done previously. However, because the claim form needed to be served out of the jurisdiction and equitable claims have no gateway, this would have blocked the claimants from bringing an action for misuse of private information. Thus, the Court finally had an incentive to draw a clear distinction as part of its ratio decidendi.

Courts have previously toyed with the idea of misuse of private information as a tort, but always obiter. The issue was decided after a great deal of to-ing and fro-ing over whether the relevant commentary in previous judgments was ratio or obiter dicta. The Court decided in favour of the claimants, separating misuse of private information from confidence as a tort in its own right, on the basis that previous courts would have done so had the circumstances demanded it. However, the Court was quick to clarify that it was not creating a new cause of action (at [51]):

This does not create a new cause of action. In our view, it simply gives the correct legal label to one that already exists.

Therefore, this is hardly revolutionary, although the Court did recognise that ‘there may be broader implications from our conclusions, for example as to remedies, limitation and vicarious liability’.

Comment

This was a strong decision. Courts have found the shoe-horning of privacy into the equitable law of confidence uncomfortable. Although counsel for Google submitted that ‘it has not caused problems before’, judicial opinion begs to differ!

I see the judgment on this issue as a clear signpost for the way privacy law is heading. A discrete cause of action is another example of judicial support for the protection of privacy.

Further, misuse of private information has been recognised as a distinct cause of action, which should make it easier to establish, particularly where there is no clear overlap with confidence.

Whilst in many cases one already has a good cause of action through the DPA, this is not always the case.

Issue #2: Is BGI personal data under the DPA?

If browser-generated information (BGI) does not constitute personal data under s 1(1) DPA, the claimants would have no cause of action under the DPA to run alongside the claim for misuse of private information.

Exact details of what BGI includes can be found in the Particulars of Claim but, to paraphrase, it includes metadata such as the websites visited and the time and duration of each visit.

This data is then ‘linked’ to an individual browser by a unique identifier stored in a tracking cookie placed on the user’s computer. Through this information, Google is able to target adverts to you: it classifies you by keywords, and then offers Advertisers the ability to target those keywords.
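As a rough illustration of how such metadata can single a browser out without ever needing a name, here is a hedged sketch. The field names are my paraphrase of the Particulars of Claim, not the actual data schema, and the keyword extraction is deliberately crude.

```python
# Rough sketch of browser-generated information (BGI) keyed to a
# pseudonymous cookie ID. Field names are a paraphrase, not Google's schema.

from dataclasses import dataclass
from collections import Counter


@dataclass
class PageVisit:
    cookie_id: str      # unique identifier stored in the tracking cookie
    url: str            # website visited
    timestamp: float    # time of the visit (epoch seconds)
    duration_secs: int  # how long the page was viewed


def build_keyword_profile(visits: list, cookie_id: str) -> Counter:
    """Collate one browser's visits into a keyword profile that adverts can
    be targeted against - no real-world name is ever required."""
    profile = Counter()
    for visit in visits:
        if visit.cookie_id == cookie_id:
            # Deliberately crude keyword extraction from the URL, for illustration
            profile.update(part for part in visit.url.lower().split("/") if part.isalpha())
    return profile


visits = [
    PageVisit("id-42", "https://example.com/cycling/reviews", 1325376000.0, 120),
    PageVisit("id-42", "https://example.org/cycling/routes", 1325376200.0, 300),
]
print(build_keyword_profile(visits, "id-42"))  # Counter({'cycling': 2, ...})
```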

The Meaning of ‘Identification’

Google advanced the argument that identification is shorthand for ‘identification by name’ – without access to ISP records, Google could not identify a user by name, and thus the claimants could not establish a claim based on s 1(1)(a). The Court disagreed. Identification essentially boils down to being able to single one internet user out from many billions.

Somewhat comically, Google suggested that this argument fails because multiple users may use the same machine. Clearly, in modern-day Europe most people access the internet on their own device, whether a phone or a laptop, and this is exactly what Google’s targeted advertising service relies on.

Segregation of Data

Google also argued that a claim could not be established under s 1(1)(b) because, although in many cases it held data identifying individuals by name that could be linked with the tracking data, it did not in fact link the two. The identifying data here would be account data, where the internet user has a Google Account (e.g. for Gmail, Calendar, Android etc.).

This argument relies on ‘identifiable’ meaning identifiable by name. Google sought to rely on Recital 26 of Directive 95/46/EC, which the DPA transposes, arguing that it constrains the relevant provision of the Directive. The Recital states that ‘account should be taken of all the means likely reasonably to be used either by the controller or by any person to identify the said person…’.

Because the Court accepted that Google never in fact linked the two data sets, Google argued that this wording narrowed the meaning of Article 2(a) (s 1 DPA). The Court knocked the argument straight out: a Recital cannot constrain a provision, and a provision with a wider meaning than a Recital must prevail. Metaphorically speaking, Articles are written in stone; Recitals are written on paper, in pencil.

Therefore, the literal meaning of s 1(1)(b) prevailed – data ‘in the possession of’ the controller that can be used to identify the individual – which was clearly established, whether or not Google had any intention of using the account data for identification purposes.

Knowledge of Third Parties

The High Court judge relied on a third route of identification in his decision: other users linking the personal information to the claimants by using their computer, or ‘peering over their shoulder’ at the adverts being displayed.

Google contended this could not stand, as the knowledge of this third party is unlikely to come into the hands of Google. Again, this pertains to s 1(1)(b).

The claimants argued you cannot exclude third parties from the equation, and that the display of contextual adverts is essentially vicarious disclosure of the claimants’ private information to a notional third party.

However, I cannot help but feel the Court of Appeal copped out here, simply deciding that there was a substantive issue to be considered, thus leaving it for trial.

Comment

I have very little doubt that the claimants will successfully categorise BGI as personal data at trial. It appears to me to be one of those happy situations where the law must follow simple logic.

The BGI in this instance is fundamentally personal. Whilst one individual piece (eg a single website visit) arguably reveals little about a person’s private life, many pieces collated together allow Google to build a comprehensive profile of that person, even if Google does not link that profile to a name.

I would therefore suggest that BGI in this context neatly fits into s 1(1)(a) – the data directly identifies an individual, even if not by name.

Issue #3: The meaning of damages in the DPA 1998, s 13

Whilst Issue #2 was resolved and, according to the Court of Appeal, there is a cause of action under the DPA, there would be no remedy for the claimants unless s 13 is read to allow damages for distress independent of any financial loss.

The claimants are claiming solely for distress and they cannot establish even nominal financial loss.

Section 13(2) sets out when damages for distress can be obtained for breach of the DPA:

(2) An individual who suffers distress by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that distress if—

(a) the individual also suffers damage by reason of the contravention, or

(b) the contravention relates to the processing of personal data for the special purposes.

Plainly, on the natural reading of s 13(2), damages for distress are possible only where the claimant also suffers ‘damage’ (ie pecuniary loss under s 13(1)), or where the contravention relates to processing for the special purposes (see s 3 – journalism etc).

The case law appeared to show that non-pecuniary damages are not available under the DPA (Johnson v Medical Defence Union [2007] EWCA Civ 262). The Court of Appeal found that this was not binding.

For the claimants in Vidal-Hall to have a remedy under the DPA, the court had to look to Directive 95/46/EC and assess whether the DPA is an effective transposition of it.

Article 23 states:

Liability

1. Member States shall provide that any person who has suffered damage as a result of an unlawful processing operation or of any act incompatible with the national provisions adopted pursuant to this Directive is entitled to receive compensation from the controller for the damage suffered.

2. The controller may be exempted from this liability, in whole or in part, if he proves that he is not responsible for the event giving rise to the damage.

This is unhelpful. It is not clear whether the intent of the European legislature was to include non-pecuniary damages.

What, then, did the EU legislature intend? The Master of the Rolls and Sharp LJ considered Leitner v TUI Deutschland GmbH & Co KG (Case C-168/00) [2002] ECR I-2631, a case concerning non-pecuniary damages under the Package Travel Directive, which likewise failed to express a position.

In that case, it was held that non-pecuniary damages were available because such losses are frequent and important in the context of package travel: the consumer is more likely to suffer a loss of enjoyment than any financial damage, so non-pecuniary damages had to be available.

Applying this principle to data protection, the Data Protection Directive relates to privacy, not economic rights, suggesting the more likely damage will be non-pecuniary. Further, reading Article 1 with Recital 10, the Data Protection Directive is based upon both the ECHR Right to Privacy and the equivalent rights under EU law, both of which frequently involve actions for non-pecuniary loss. Therefore, common sense suggests the Data Protection Directive must allow non-pecuniary damages.

The Court found that, despite this implied meaning, the UK Parliament appeared to have intended to restrict the meaning of damage in its implementation, and thus the Court could not interpret s 13(2) compatibly with Article 23 of the Directive (the Marleasing principle). The only remedy then available would be against the state (see Francovich).

However, the Court relied on s 13(2)’s incompatibility with the underlying rights on which the DPA is based, namely the rights in Articles 7 and 8 of the EU Charter of Fundamental Rights.

The Charter applies where domestic bodies are implementing EU legislation, as was the case when Parliament implemented the Data Protection Directive into domestic law in the form of the DPA.

UK courts have previously refused to disapply domestic legislation on such grounds, as in the infamous prisoner-voting case (R (Chester) v Secretary of State for Justice [2013] UKSC 63). However, the Court of Appeal distinguished the present case from Chester, as that case would have required the Supreme Court to make legislative choices that were outside its jurisdiction (eg practical and administrative choices about how to go about permitting prisoner voting).

Here, the Court of Appeal did not need to make any legislative choices: by disapplying s 13(2), the claim falls back onto s 13(1), which can be read to include all damage, whether pecuniary or otherwise. In essence, the Court did not need to read any additional provisions into the DPA for it to ‘make sense’.

Comment

Whilst I feel the Court of Appeal’s reasoning is sound, it opens the door to many more years of appeals. This mere application for service out of the jurisdiction may well see the Supreme Court and the ECJ before it even reaches trial.

Given how long these things take, and assuming it is appealed all the way to the ECJ, it is unlikely a substantive trial will be held on this matter before 2018. By this point the new EU Data Protection Regulation will, we all hope, be in force, and the reading of the current Directive will be entirely redundant.

This is an undesirable state of affairs. However, it is all likely to be immaterial.

Conclusion: None of This Really Matters

Ignorance and web browser settings

Currently, advertising platforms rely on the ignorance of their users. The vast majority of internet users are oblivious to what Google DoubleClick, and the many other services like it, are doing.

Therefore, whilst the decisions in Vidal-Hall are all very interesting, until all web browsers ship with default settings that block third-party cookies and/or send a ‘Do Not Track’ request to these services, the outcome of this case will have very little significance.
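For reference, ‘Do Not Track’ is simply an HTTP request header (DNT: 1) that the browser can send on the user’s behalf; whether anything happens next is entirely up to the recipient. A minimal sketch of a service choosing to honour it might look like this (everything apart from the DNT header itself is hypothetical):

```python
# Minimal sketch of a tracking service voluntarily honouring the
# 'DNT: 1' (Do Not Track) request header. Everything apart from the
# DNT header itself is hypothetical.

def should_track(request_headers: dict) -> bool:
    """Return False when the browser has asked not to be tracked."""
    return request_headers.get("DNT") != "1"


# A browser shipping with Do Not Track enabled by default would send:
headers = {"User-Agent": "ExampleBrowser/1.0", "DNT": "1"}
print(should_track(headers))  # False - but honouring the header is voluntary
```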

Further, Google and its competitors will likely continue to advocate that an opt-out is not valid unless expressly done by the end-user, rather than assumed for them by browser vendors.

Ultimately, therefore, the significance of Vidal-Hall will depend on two things, assuming the legislative position remains the same:

1. Whether more browser vendors adopt a default position of defending their users’ privacy; and

2. Whether a non-explicit opt-out of tracking has any legal effect.

This second issue may well be resolved at trial (and already has been in the US, where Google was fined for this same workaround), but the first is solely in the hands of browser vendors.

Sadly, by far the most popular browser is Google Chrome with a market share of 62.5%, and somehow I do not think they will be adopting a default setting of blocking tracking any time soon…

Future Legislation

The only solution is legislation that requires explicit consent.

The upcoming Data Protection Regulation, in its current draft form, proposes to require explicit consent from users for data collection. Data collection under the new Regulation is likely to require a ‘freely given, specific, informed and explicit indication … either by a statement or by a clear affirmative action’.

How this will apply in the context of online tracking cookies remains to be seen. Practically speaking, such a requirement could not oblige browsers to block tracking cookies under the new Regulation: the browser is not the data controller; the organisation setting the cookie is.
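By way of illustration only, a consent-gated tracking cookie set by the data controller (rather than blocked by the browser) might look something like the sketch below. The function and cookie names are hypothetical; the draft Regulation prescribes no particular mechanism.

```python
# Hypothetical sketch of 'explicit consent' gating a tracking cookie.
# Cookie and function names are invented; the draft Regulation does not
# prescribe any particular mechanism.

def set_tracking_cookie(response_headers: dict, affirmative_consent: bool) -> None:
    """Only set the tracking cookie after a clear affirmative action
    (eg ticking an unticked box) - never by default or by inactivity."""
    if not affirmative_consent:
        return  # no consent, no cookie
    response_headers["Set-Cookie"] = "track_id=abc123; Max-Age=31536000; Secure"


headers = {}
set_tracking_cookie(headers, affirmative_consent=False)
print(headers)  # {} - nothing set without an explicit opt-in
```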

But how else could Google obtain consent?

Of course, changing your web browser settings from a default of disabled to explicitly enable third-party tracking would be clear consent.

My problem here is that it may well be easy for Google to get around any new consent requirement by cutting browsers out of the picture (at least in so far as they do not disable tracking by default).

I dare say the vast majority of internet users use Google on a weekly basis. By merely continuing to use the search engine after seeing a prompt saying ‘by submitting a Google search, you hereby consent to our privacy policy’, you may have given explicit indication of your consent to everything contained within Google’s privacy policy.

Would this stand? Based on how consent under the 2011 EU Cookie Directive has been implemented, I think so. This ‘passive explicit consent’ is the most common method of implementing that Directive, and has received little criticism.

Thus, it seems that Google may well be too big to defeat without specific legislation requiring browsers (including Google Chrome) to ship with default settings blocking tracking cookies.

Chris Bridges is the Executive Editor of Keep Calm Talk Law. His interests lie in where Human Rights and Technology collide.