Fundamentally Giddy: An IFCLA Conference Report

June 14, 2016

Giddy. That’s probably the overwhelming feeling I had during two days at the IFCLA Conference.

Not just giddy from the spectacular views of London from the IET roof terrace, where drinks were served before the gala dinner on an equally spectacular (and rare) sunny June evening in England.

But giddy from the sheer range of really quite fundamental issues discussed by the assorted panellists, keynote speakers and of course delegates.

When I say fundamental, I really mean it. Not much can be more fundamental than Professor Mischa Dohler’s closing conjecture that the year 2400 could see hard-working algorithms being hosted by a human for a holiday. While such statements may quite rightly be viewed as playfully provocative, as befits a keynote speech, they also hint at the more immediate changes happening in technology that, at the very least, challenge almost every legal code and regulatory process and, at the very worst, could see them bypassed altogether.

Many of the talks presented during the Conference are reflected in articles written for the special issue of Computers & Law that was published at the time of the Conference and which are available on the SCL website, so this is not the place to repeat what is already out there. Instead I would like to pick out a few of what I perceived were the underlying questions that will continue to exercise minds between now and the IFCLA Conference 2018 in Paris.

Given the wide range of topics & ideas covered, and the incontrovertible fact that I could not be in two places at once when the audience split for the streamed sessions (at least not yet), this is no more than a snapshot but I hope it gives you a flavour of what was an outstanding event. [i]

Should regulation come before technology? Or in other words, just because we can do it, should we do it?

This point was made most stridently by Professor Indra Spiecker genannt Döhmann in her contribution to a session on the legal implications of robotic systems. Her main brief had been to look at the issues arising from their use in healthcare but it developed into a more general review of where we are with the notions of privacy and personal autonomy. For example, why do we so blithely send our highly personal health data, that perhaps even our doctor does not know, to a server in California when we have little idea of what use is made of it? And are we happy to leave it to commercial interests to develop systems that might become all the more powerful and controlling, such as warning users about drinking even when it is legal? Might the answer be for the regulators to hold back permission for such momentous changes in the lives of the people until a consensus is reached on what to do? Going further, she would like to see a renaissance of government to manage these critical issues on behalf of democracy.

Although not as explicit, that same theme of the importance of regulation shot through Julie Samnadda’s talk on the Digital Single Market. I cannot report much on what she had to say given her position at the Legal Service of the European Commission and the rapidly approaching EU Referendum (the elephant in the room, as she described it), but it was suffused with the notion that regulation should sometimes deter certain practices and not just reflect what technology has created.

However, in the same session, Roger Bickerstaff, outgoing IFCLA President and Chair for the Conference, was less certain. Such a regulated approach could lead to the creation of a ‘walled garden’ when it comes to European e-commerce: perhaps the transparency represented by the ratings and reviews of Uber and TripAdvisor is a better protector of consumer rights.

Both Indra’s and Julie’s talks seemed to symbolise the philosophical gap between common law and civil law systems, and that gap reared its head in many sessions, particularly anything to do with single markets or data protection. In many ways common law is the free market of legal regulation, and the instincts of the common law native practitioner are at odds with the more rational approach of the civil law continental, who sees regulation as the answer.

Of course nothing is as simple as that, and Dr Joachim Schwerin rather muddied my ill-formed theory in his contribution to the FinTech session. As an economist responsible for policy & regulation for SMEs at the European Commission, he is well aware of the possibilities inherent in alternative sources of finance for business, such as crowdfunding, with their ability to allow people to follow their own interests, not those of intermediaries. So he favours a more liberal approach of ‘laissez-faire within limits’.

Is the tech economy becoming too big or moving too fast to be controlled by existing state and regulatory boundaries?

As Professor Mischa Dohler reminded us in his Keynote Address to open Day 2 of the Conference, the global digital marketplace has enabled the creation of billion-dollar businesses within months and with few employees: Instagram, for example. Whether these valuations owe more to the slick marketing machine of Silicon Valley, with its propensity to issue a small chunk of shares at possibly over-inflated values that do not reflect the true worth of the underlying business, is beside the point. Billion-dollar businesses have been created this way, often employing only a handful of people compared with old-industry businesses.

Chris Holder, in the Robotics session, touched on the enormous social implications that such business models engender. In the US, regulators are already examining what can be done to tax businesses sitting on cash piles that are not being distributed through philanthropy or the employment of large-scale workforces. Emerging plans for basic incomes for citizens suffering reduced employment in the face of an increasingly robotic world are another consequence.

Mischa Dohler was also very worried about the trend for Google, Microsoft & Facebook to build their own trans-Atlantic data pipes and domestic networks, undermining the cherished notion of net neutrality and possibly eroding any form of control that states and regulators had over the telecoms providers who previously undertook such operations.

Will data ever be secure enough?

This is a mission-critical question as we come to rely on increasingly sophisticated digital solutions. The problem is, as Mischa Dohler lamented, that there is no safe theory of data security in IT. In an era of increasing ‘criticality’ of data systems that is a worry.

The panel discussing blockchains at the end of the Conference was fully aware that this technology, while seemingly secure for now, faces two fundamental challenges.

Firstly, as Dr Jonathan Cave pointed out, a blockchain is not anonymous but pseudonymous. There is still a record of transactions in the blockchain database and an ID attached to them. If you can match the ID of a user to a transaction ID, then you have cracked it.

Secondly, the development of quantum computing could render the cryptography built into blockchains eminently hackable through the sheer brute force of the processing power a quantum computer should provide.

If we agree that there is, as yet, no absolute data security, then what are the implications for the data required to enable the Internet of Things? Could your web-enabled pacemaker be hacked? Or, as Dr Anselm Brandi-Dohrn, chair of the Robotics session, pointed out, the weak point in the data security of driverless cars is the garage you will take the car to for servicing.

Are privacy & personal autonomy dying concepts?

The consensus seems to be that the consumer cares more about free than freedom.  Why worry about what FitBit will do with highly personal medical data when it can tell you how many steps you have walked that day? If your life is made easier by Google suggesting restaurants as you walk round Rome then hey who’s not up for that? If that means that someone, somewhere knows where you are and what you are doing then what’s the problem if you’ve not done anything wrong?

Time and again we heard concerns about where the use of data is heading. In the session devoted to data protection and the GDPR, all panellists seemed to agree that their children do not blink when, in return for access to a free app, they are asked to tick a box stating that they have ‘read & understood’ the terms of use and what the software developer is doing with their data. Richard Thomas, the former head of the ICO, thought one worrying consequence is that this clumsy mechanism is turning us into a ‘nation of liars’.

In the same session Nina Barakzai, from Sky, was more positive with a message that organisations which embrace transparency in data processing will be the ones that succeed in the future. For example, her team is investigating the use of meaningful icons to replace impenetrable legal contracts so that consent can become a clearer, more informed process. Such thinking could make data protection easier and therefore cheaper to implement and manage.

As for personal autonomy, I know what you’re thinking. Or at least there are plenty of corporations out there trying to do that. At the lowest level, this might be suggesting a restaurant as you walk past it. At more advanced levels, this would be ceding control to an algorithm or robotic system entirely. Professor Spiecker touched on the situation where a robotic nurse could care for a patient with dementia: how would you determine and regulate what’s in the interests of the patient? Even more difficult would be the situation where that patient is drifting in and out of capacity, as can often be the case.  

Are jobs and, closer to home, lawyers doomed?

A succession of contributors ‘crystal-balled’, either explicitly or in passing, the question of whether robots and AI will lead to less work for all, and for lawyers in particular.

Professor Richard Susskind opened the Conference with his customary comprehensive, insightful review of the brute inevitability of the rise of the AI machine. He mentioned Lex Machina, the patent litigation database that predicts case outcomes better than the judges do, the emergence of self-regulating smart contracts and the increasing availability of tools to help litigants gain the legal knowledge they need to handle their own issues. While judgment is the tool that humans use to handle uncertainty, it may be that computers of ever-increasing processing power can handle it more effectively.

Not so customary, though, was his unexpectedly positive message towards the end of his talk: lawyers may disappear in the long term, but there is plenty to keep those in the hall busy before they retire.

In the Robotics session, Dr Rob Buckingham was more upbeat, despite being, or perhaps because he is, a world leader in robotic systems. Two trains of thought powered this optimism. First, we still do not understand human consciousness and so cannot code it. Second, humans have a long history of adapting to technological change and creating meaningful work where the technology stops.

Jenna Karadbil, Immediate Past President of the International Technology Law Association, backed up this more positive view about lawyers’ survival chances in a session looking at the digital transformation of legal services. She showed a slide with the results of a study into which professions are most at risk of automation. The researchers had considered questions such as whether the professional needed to ‘come up with clever solutions’, whether the job requires negotiation and, somewhat surprisingly, whether the work ‘requires someone to squeeze into small places’. Once the figures had been calculated, the authors estimated that lawyers face only a 3.5% risk of automation (though the figures for paralegals were nowhere near as comforting – over 50% if I remember correctly what was said in passing).

So there is the reassurance, if needed, that there should be enough lawyers around in 2018 to make the Paris event as refreshing as London 2016. That is as long as practitioners don’t suddenly downsize offices and start squeezing into small spaces.

David Chaplin is an SCL member and director of Bath Publishing, online law publishers.

[i] Apologies to any speakers who have not been name-checked or topics not covered. There is just too much to mention. Also, I was not present at the Internet of Things, IT in Health Care or the Cybersecurity sessions for the very obvious reason that I was attending the other streams. So any omission of those in this report is regrettable but hopefully understandable.