Augmented Reality Too

October 8, 2013

In my last piece I promised to get to the point of why augmented reality is a big issue for technology lawyers, and to cover some of the topics I have encountered in dealing, predominantly, with clients operating in this field.

Terminator[1]

John Connor (Terminator 2: Judgment Day): “Mom! We need to be a little more constructive here, okay?”

When image-sharing websites were first launched and social media sites then sprang up like proverbial mushrooms, legal doom-mongers suggested that data protection would be their downfall. Nope. At a recent SCL talk on AR, Peter Hall of Taylor Vinters posed the question: ‘what if Google Glass collects data and the wearer decides to share that data with friends, some of whom are overseas?’ Is this a data protection issue? Technically, yes. Many social media users actively disclose personal information rather than seek to protect it, endlessly re-tweeting images they appear in, images they have taken, or images which simply seem interesting (to them). So would a person object to the export of information collected informally – or even inadvertently – via technology such as Google Glass? Probably not. In that case, as lawyers and as individuals, do we really need to concern ourselves with this?

The trouble is that it is not just Google Glass and body-worn AR like the iWatch and Android watch phones that collect data; the AR systems themselves collect personal data and big data in order to learn from it and improve the system offering. Images, videos, shopping habits, biometrics and personal preferences recorded by an AR system would almost certainly fall within the definition of ‘data’ under the Data Protection Act 1998 (DPA). Even if it is collected by an individual (in the case of an AR device wearer) and tagged with a comment for retrieval later, it forms part of an accessible record and is data. That being the case, permission is needed for its collection and processing. But from a practical perspective, can it be sought, and how will this be done?

Consent at the point of collection should be built into the software, but isn’t the AR provider just a controller? And what of fair processing of anonymised data once the customer has a version of the original data themselves? What if the customer enhances it, changes colours, adds to or alters the image, then stores or circulates it, while the retailer or data processor holds an original – now a totally separate piece of data with no automatic link back to the customer? Can a customer decline consent to the use of one, both or more versions and still use the AR product? What about opt-out and destruction of data in future? Is that actually practical?
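As a thought experiment rather than anything any AR provider actually does, the ‘no automatic link back’ problem can be made concrete: if every augmented copy carried a provenance pointer to the original capture, a later withdrawal of consent could reach all versions. A minimal sketch, with every name invented for illustration:

```python
import uuid
from dataclasses import dataclass

@dataclass(frozen=True)
class ImageRecord:
    """One stored version of an image, with a pointer to its parent version."""
    image_id: str
    subject_id: str          # the data subject the image depicts
    parent_id: str | None    # None for the original capture

def derive(parent: ImageRecord) -> ImageRecord:
    """An augmented copy keeps the chain back to the original capture."""
    return ImageRecord(uuid.uuid4().hex, parent.subject_id, parent.image_id)

def versions_of(store: list[ImageRecord], subject_id: str) -> list[ImageRecord]:
    """Every stored version relating to one data subject, e.g. for erasure."""
    return [r for r in store if r.subject_id == subject_id]
```

Without something of this kind, the retailer’s enhanced copy really is, as described above, a separate piece of data with no route back to the person it depicts.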

I do not believe it is safe to argue that download of an app or use of an AR program alone implies consent, and usually I recommend an opt-in. But as Phil Lee said in his recent (and thought-provoking) article for SCL (‘A Brave New World Demands Brave New Thinking’), disclaimers and opt-ins are lazy tools for lawyers; workarounds and patches instead of fault fixes. The practicalities of how consent can be imputed must be addressed by the law and the legal community. Perhaps via a general consent app on a device, which the AR program scans or refers to for consents before permitting download and image sharing? Perhaps by a system of tagging which alerts the person tagged to use of an AR image containing their likeness? All suggestions welcome, please.
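To make the first suggestion slightly less abstract, here is a minimal sketch of an AR program checking a hypothetical on-device consent registry before sharing. Nothing here reflects a real API – ConsentRegistry, Purpose and share_image are all invented for the example – and per-purpose opt-in consent is the assumption being illustrated:

```python
from enum import Enum, auto

class Purpose(Enum):
    """Processing purposes a data subject could consent to individually."""
    CAPTURE = auto()   # recording an image containing the subject
    SHARE = auto()     # circulating it, including transfer overseas
    AUGMENT = auto()   # altering or enhancing the captured image

class ConsentRegistry:
    """Hypothetical on-device store of granted consents, keyed by subject."""

    def __init__(self) -> None:
        self._grants: dict[str, set[Purpose]] = {}

    def grant(self, subject_id: str, *purposes: Purpose) -> None:
        self._grants.setdefault(subject_id, set()).update(purposes)

    def revoke(self, subject_id: str, purpose: Purpose) -> None:
        # Consent must be withdrawable: the 'opt out in future' question.
        self._grants.get(subject_id, set()).discard(purpose)

    def permits(self, subject_id: str, purpose: Purpose) -> bool:
        return purpose in self._grants.get(subject_id, set())

def share_image(registry: ConsentRegistry, subject_id: str, image: bytes) -> None:
    """Refuse to share unless an explicit opt-in is on record."""
    if not registry.permits(subject_id, Purpose.SHARE):
        raise PermissionError(f"no SHARE consent recorded for {subject_id}")
    # ... the actual upload/sharing logic would go here ...
```

The interesting legal question is not the lookup itself but who maintains the registry and whether silence defaults to refusal, as it does in this sketch.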

Minority Report[2]

Dr. Iris Hineman: “Sometimes in order to see the light, you have to risk the dark.”

Every so often someone will not want to participate in AR and targeted marketing and will want ‘real’ rather than augmented. One argument often used in favour of less stringent privacy runs like this: ‘what if the only way to save someone’s life is to breach their privacy – would that person really mind if the information was released?’ The argument is emotive and, I suspect, largely academic – and where a form of AR might be used in halo MRI or similar, the answer seems obvious. But much AR is life-enhancing rather than life-saving, in which case can we say ‘no’ to its use for us and on us?

How can an AR nay-sayer be ‘unplugged’? The answer is that they probably can’t. Our law currently provides only inchoate protection for personal privacy and, whilst the DPA and the ECHR might apply, the common law is still developing – feeling its way in this area. A DPA opt-out might seem an obvious answer, but even people who have successfully extricated themselves or their property from images on Google Earth might ask ‘was it worth it?’ Of course it probably was, in terms of the message it sends against blanket encroachment on personal privacy, but individuals can’t fight every privacy battle against every corporation. Perhaps an Advertising Standards Authority or Direct Marketing Association equivalent will intervene with a universal AR opt-out, or someone not wanting to participate may simply download a blocking app for AR content directed at them (see the sketch below). The latter is an eminently practical solution compared with the expensive legal one of litigating every AR instance. But what of choice and the ability to record a desire to go back ‘on grid’? Can we change our minds once we make them up? Use of bodymetrics to recognise individuals might in fact assist in establishing who does and who does not want AR-targeted marketing. Bodymetrics is the Holy Grail in some AR circles, as it has so many possible uses. How ironic, then, that what could be used as a tool to ensure privacy might be stymied if privacy and data protection laws are applied to it in a knee-jerk manner.
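The blocking-app idea could be as simple as a client-side filter consulting a universal opt-out list before rendering anything aimed at a recognised individual. A sketch under that assumption only – no such registry exists, and all identifiers here are invented:

```python
# Illustrative only: a client-side AR filter honouring a universal opt-out
# list. All names are hypothetical; no such registry currently exists.

OPT_OUT_REGISTRY: set[str] = set()   # people who have gone 'off grid'

def opt_out(person_id: str) -> None:
    OPT_OUT_REGISTRY.add(person_id)

def opt_back_in(person_id: str) -> None:
    # 'Can we change our minds?' – re-joining must be just as easy.
    OPT_OUT_REGISTRY.discard(person_id)

def render_targeted_overlay(person_id: str, overlay: str) -> str | None:
    """Suppress AR marketing directed at anyone on the opt-out list."""
    if person_id in OPT_OUT_REGISTRY:
        return None   # show unaugmented reality instead
    return overlay
```

Note the irony flagged above: the filter only works if the system can recognise the individual in the first place, which is exactly the bodymetric processing a knee-jerk application of data protection law might prohibit.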

The Matrix[3]

Agent Brown: “Perhaps we are asking the wrong question.”

As Ashley Hurst and Iona Millership pointed out in their recent article for SCL (http://www.scl.org/site.aspx?i=ed32924), English law does not recognise personality rights as such. Legal battles based on personality and persona have had to be framed in terms of passing off, breach of privacy or confidentiality and seem contrived. Whilst Guernsey may still be blazing a (faint) trail in terms of image registration, other jurisdictions have historically done more to protect the rights of personalities to their own image in terms of persona, copyright and trade marks.

A Google Glass wearer takes a snap of another person: they like their look, they like their looks, or they don’t like them at all. The wearer then tags the person they believe they have snapped in a post on Facebook, placing them in a place at a time, perhaps not entirely to the subject’s liking (haven’t we all cringed at tagged posts of ourselves on FB?). But what if the person tagged is not who the wearer thinks it is? What if someone believes that the image they have is of a personality, whereas it is not? In copyright law, false attribution may prevail, but the principle does not extend to false attribution of a personality. Protection of the elements of a ‘personage’ (to borrow from Guernsey) doesn’t just extend to the AR world; the opportunities for alteration of an image, for ‘augmentation’ of it using AR, are endless, and as lawyers we should be considering how the original non-augmented person should be protected and how the AR version is dealt with too.

The Sixth Day[4]

Adam Gibson: “I might be back.”

Another ‘what if’ arises once an original image has been augmented: who then owns what rights in the original version, and who owns rights in the augmented one? This is a question I am always asked by clients providing AR and by their customers receiving it. The example I tend to use is photographs, which 100 years ago were considered inferior copies and not protected under copyright; which were then recognised as involving a sufficient degree of skill and endeavour to warrant protection; and which now tend to form two distinct threads – art and snaps – arguably copyright works and non-copyright works respectively.

I call this ‘The Sixth Day Dilemma’ because in the film Arnie’s character loses his memory and is cloned again. In this situation, the AR ‘cloning’ of something creates an entirely new work in which the AR provider owns copyright in the code, in the later version and in the non-physical ‘embodiment’ (such as an AR installation or stereoscopic projection as a work of art). Treating the AR output as an intellectual property asset in this way enables us to assign or retain it via the contractual arrangements and, crucially, to control use and exploitation downstream, although this is increasingly controversial.

As a general point of principle, and to ensure a portfolio of rights going forward, I advise my clients to retain ownership of all the software that drives their AR program and of the output from it. We achieve this with varying degrees of success in terms of retention, or financial compensation for assignment (particularly in the David and Goliath fields of medical applications and retail). In all cases, my clients disclaim liability for any use of the program in out-of-spec ways and for use of the output from it – likening their supply to that of service providers and search engines.

Blade Runner[5]

Deckard: “Replicants are like any other machine. They’re either a benefit or a hazard.”

The benefits of AR are many, but the rights it creates are complex, particularly as it gives the capacity to reinvent something which seems real but fundamentally is not. This raises a question: how do we distinguish ethically between AR and true-reality output, and do we in fact need to?

Lawyers are not known for their grasp of ethics and, in the new digital age, are ethics really a consideration provided we mark AR as such? Aren’t the benefits of AR just technological advancements like those of any other device which improves our lives, our well-being or our human experience? Of course AR is all that, but whether in the context of the application of law to a product, a law to ensure product safety, a law to govern a user’s expectations or a voluntary code of conduct, modern real life requires regulatory boundaries and parameters, and so does AR. Not only must statute be applied; commercial codes in the countries where the AR is deployed (worldwide in the case of internet AR), voluntary codes, professional standards and membership organisation regulations must all be complied with too.

It is worth noting that in the global economy overly protective consumer rights and business codes do nothing to protect technology businesses, and even codes which may seem well intentioned (such as the California Commercial Code and the warranties it implies into software contracts) can restrict trade when an out-of-state operator wishes to deploy new technology there. A warning, then, to colleagues: research, check and take local legal advice about common and obscure laws alike if AR is to be deployed or made available outside the EU.

Avatar[6]

Jake Sully: “Everything is backwards now, like out there is the true world and in here is the dream.”

Various advertising voluntary codes in the EU and elsewhere share a common thread requiring advertisers not to misrepresent or mislead customers about the products they sell. Is AR the greatest misrepresentation currently available and is it regulated? 

To comply with UK ASA requirements, all AR images (whether 2D or 3D) must indicate that they are computer-generated images – or, like the new Fiat 500 advert, that they are ‘fiction: not real’. AR used for marketing is not mere ‘puffery’, which would fall outside the code; it is real, but better, and so it does fall within the scope of ASA regulation.

None of my clients has yet had a run-in with the ASA, and none of their customers has asked for a ‘disclaimer’ of reality, but strictly speaking they should. My clients are far more creative than me, so after writing this I will tell them what is needed and leave them to find a way of adding in such references without a bland and obvious statement such as ‘not real’, ‘fiction’ or ‘i’!
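Whatever creative form the marker eventually takes, it could be driven by a single flag travelling with the AR asset itself, so that a publishing pipeline refuses to release unlabelled marketing AR. A hypothetical sketch only – the ASA mandates no such format, and ARAsset and publish are names invented for this example:

```python
from dataclasses import dataclass, field

@dataclass
class ARAsset:
    """An AR image plus the provenance metadata a regulator might expect."""
    pixels: bytes
    computer_generated: bool = True              # the substance of the disclosure
    disclosure_label: str = "fiction: not real"  # per the Fiat 500 example
    meta: dict = field(default_factory=dict)

def publish(asset: ARAsset) -> dict:
    """Refuse to publish marketing AR without its disclosure marker."""
    if asset.computer_generated and not asset.disclosure_label:
        raise ValueError("ASA-style disclosure required before publication")
    return {"label": asset.disclosure_label, **asset.meta}
```

The point of structuring it this way is that compliance becomes a property of the asset rather than an afterthought bolted onto each campaign.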

In the AR development agreement, I always place responsibility for compliance with codes of conduct and industry-specific regulations onto the AR customer, and I make display of the relevant code (or a link to it, or guidance on it) integral to the specification. I never permit my clients to give warranties in relation to such codes, or any indemnity which may refer to them.

Incidentally, I do wonder whether ASA requirements would apply to fields related to AR such as 3D printing, where ‘real’ is not the same as original or manufactured. As Sully says, ‘everything is backwards’, with compliance driving design rather than efficiency.

RoboCop[7]

RoboCop: “Come quietly or there will be... trouble.”

RoboCop is set in a near-future Detroit which has been financially crippled by unchecked crime; had the crippling come from non-practising entities rather than Chapter 9 bankruptcy, the film would have been prescient. In the US, 85% of patent claims filed are brought by non-practising entities (NPEs), and these are estimated to cost the economy a whopping $29 billion[8] – a truly crippling sum to waste on extortion.

As software companies jettison parts of their business that were once core and move into new areas, they often try to commercialise their old patents using troll techniques and to recover some of the development spend through settlement and licensing. As a business methodology, it seems pretty infallible:

1. Find an outrageously wide business method patent.

2. Assign that patent to a ‘commercialisation agency’.

3. Demand payment of money at a level just below the amount required to defend a patent infringement case.

4. Rub your hands gleefully as the respondent agrees through gritted teeth to pay up rather than incur ridiculous litigation fees without any prospect of recovery against you even if they win.

In the land of the brave and free (and, this month, the land of closed government), in any emerging field where there is a method there is usually a pre-existing US patent and a hairy, hoary troll just waiting to hammer creativity and innovation. There are currently over 250 patents in the US concerning AR methods; once the AR wars begin, the phone wars will seem facile.

In the last of my articles on AR, I will discuss my experiences and the horrors my clients have endured facing down US patent trolls, and whether the groundswell against NPEs may come in the nick of time to benefit the entire AR field.

Joanne Frears is a Solicitor and Head of Intellectual Property at Jeffrey Green Russell Limited.



[1] Director: James Cameron. Writers: James Cameron, Gale Anne Hurd, William Wisher, Jr.

[2] Writer: Philip K Dick. Director: Steven Spielberg

[3] Writers and Directors: Andy & Lana Wachowski

[4] Writers: Cormac Wibberley, Marianne Wibberley. Director: Roger Spottiswoode

[5] Writers: Hampton Fancher, David Peoples [based on Philip K Dick: Do Androids Dream of Electric Sheep?]. Director: Ridley Scott

[6] Writer & Director: James Cameron

[7] Writers: Edward Neumeier & Michael Miner. Director: Paul Verhoeven

[8] James E Bessen, Michael J Meurer and Jennifer Larissa Ford, Boston University School of Law, Law & Economics Research Paper No 11-45