The Price of Convenience

Lillian Pang and Peter Lee ask some difficult questions about data protection and privacy regulation, the effects of the ‘hoovering-up’ of data by all manner of applications and the extent of the real demand for privacy in the use of technology

Do you remember when the transition happened from using technology as a mere tool confined to the workplace to it becoming a convenient tool that we carry around all the time? The transition didn't happen overnight, but during it we gradually enabled the erosion of our privacy by silently acquiescing to the drive for interactive devices: devices that enable us to talk to each other freely, that allow us to track what our friends do or where they go, that navigate us to shops we have never visited or find discounts on branded items.

The impact of silent acquiescence

Today we crave the ability to check emails on the go, get weather reports at our fingertips and read news of all kinds at the touch of a button. We wanted our TVs to be interactive, to select programs on demand and to rewind or fast forward without having to watch the dreaded advertisements -- and that happened. Our service providers delivered, and we are all ecstatic at the thought of being able to control what we do, how we do it and when we do it. All this came seamlessly but, unbeknown to us all, it came at the price of our privacy.

To try to keep up with the speed of technology, the European Commission released its draft of the proposed Data Protection Regulation in early 2012 with the aim of strengthening and safeguarding our online privacy rights. But does the European Commission speak for each and every European citizen when it comes to what has become the almost fundamental right to use laptops, tablets, mobile technology and the like on a daily basis?

Economic realities of regulation

There are elements of the proposed new Regulation that impose more obligations on the data controller and data processor to ensure the safeguarding of personal data. This extended scope also reaches non-EU companies that control or process the personal data of EU citizens; such companies may soon be subject to the new law. Furthermore, the proposed Regulation limits the ability of companies to profile the users of their services automatically, requiring the prior express consent of such users. Clearly the new Regulation covers many areas of privacy, and its overall intention to protect personal data must be acknowledged and endorsed. From an economic perspective, however, the cost of complying with the proposed Regulation may stifle business growth and profitability, which leaves many businesses concerned.

The fact is that, in order for any of our current technology devices to work, we have to compromise our personal privacy. For example, we grant app providers full network access to our mobile devices (smartphones and tablets), to the point where we accept such permissions without questioning them.

As we look to bolster our privacy under the shadow of the EU's continuing debate on the new Data Protection Regulation, the overriding principle seems to be that users need to consent explicitly to the use of their personal data. The question remains: should the obligation on the service provider be limited to engaging in tick-box exercises, or should the service provider have an obligation (whether legal, moral or ethical) to explain to users exactly what is captured and how the collected data is used (as opposed to hiding all this in its privacy policy)?

Is it time to question access privileges?

Privacy policies tend to capture this information, but in such a broadly generic way that the user is little better informed for having read them. Have you ever questioned why an app provider needs full network access and permissions on your device, and what this means in practice? Certain permissions are required to make an app function, but why do such a high percentage of apps require full access to your contact list, calendar or even the operating system on your device? Should we as consumers be content when some service providers may scan our e-mails and use the information gathered for their own commercial purposes? Should the choice simply be 'don't download the app if you don't like the terms'?

Let us briefly consider how this impacts on us in the workplace and why we should be concerned. With the rise of Bring Your Own Device (BYOD) in the workplace, choosing to use your personal device or laptop as your work machine may open up your personal data for your employer to see. Employers don't want employees leaving with company data, and employees equally do not want employers nosing into their personal data. Yet there is currently no platform that sandboxes the two sets of data so that they can be kept distinct.

IBM's recent acquisition of Fiberlink is apparently aimed at bringing privacy by design to the world of BYOD. It will be interesting to see whether IBM's purchase of this privately held mobile management and security company will further promote BYOD in the workplace. Furthermore, IBM hopes automatically to vet apps downloaded onto a mobile device that request unnecessary access to contact lists, calendars and the like.

A new era of privacy by design

It seems we are well on the path towards 'privacy by design' in some areas of technology. Take smartphone and other connected device companies, which exist within a rather complex and competitive ecosystem. Companies at all levels of the value chain, from OEMs to downstream suppliers into the ecosystem such as chipset and software vendors, struggle to obtain and collect data on the use and implementation of their technologies in a live environment. It is technically possible to obtain information remotely about device activation, the version of the supplier's technology in use, geographic location, user activity and other such data. Data of this type can deliver critical competitive advantages and inform companies' market strategies, such as return on marketing investment and time-to-market intelligence. However, if it is harvested in a manner that is not legally compliant, the methods can spectacularly backfire.

It is worth briefly examining some of the legal issues around this type of data collection activity by technology companies and how an end-user's rights might be infringed.

As an example let's take a downstream supplier to a connected device OEM ('X') who wishes to collect information on device activation and the use of its materials. The extent of X's privacy obligations under current English law will depend on whether X is able to use the data it collects to identify the end-users of the devices or any other living individual. Personal data is defined under the Data Protection Act 1998 as 'data which relate to a living individual who can be identified (a) from those data, or (b) from those data and other information which is in the possession of, or is likely to come into the possession of, the data controller....'. The DPA will not apply to X if it cannot identify a living person from the data collected. However, if X holds any other data which, when combined with the data collected, could identify a living person then the DPA will apply and with it the various obligations it imposes on data controllers plus the spectre of fines and adverse publicity risk for X's non-compliance.

Location, location, location-based

What about location-based privacy issues? Regulation 14 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR) states that the processing of location data is permitted only if the user cannot be identified from that data, or if the processing is necessary for the provision of a value-added service and is carried out with the consent of the user. Just as data from which no living individual can be identified falls outside the DPA, so it falls outside the PECR. If, however, a living individual can be identified from the data, X will need to comply with fairly onerous obligations in relation to obtaining the consent of that individual to the use of geo-location services.

The LG case

This issue of how we consent to sharing our personal data, choices and tastes has been highlighted in the recent faux pas committed by electronics company LG. One LG customer revealed that his LG Smart TV was feeding data back to LG every time he changed the channel. The connected TV was also scanning all shared files on the user's home network and sending a running total of those back to LG as well. LG allegedly offered an opt-out of 'Collection of watching info' in its options menu, but apparently toggling the opt-out had no effect. Furthermore, it seems that all the data was sent unencrypted, so anyone with access to the network could openly view the information. This incident raises the question of whether companies in this tech space should be doing more than merely providing the option of opting out, given that consumers are not generally aware of the data-collecting functions of the devices they purchase.

Interestingly, the proposed Data Protection Regulation provides that a company's ability automatically to profile users of its services will be limited and that such companies will require the prior explicit consent of the individuals whose data they intend to process. So perhaps LG would be compelled under the proposed Regulation to seek the explicit consent of its consumers before engaging in profiling.

Data democracy for the people

But if LG chose not to seek explicit consent, or had already undertaken the profiling, how would this additional right benefit the users of its services? The recent purchase by Google of connected home device maker Nest for $3.2bn will mean 'infinitely more intelligent' devices according to Google's Executive Chairman Eric Schmidt, but at the same time it raises ever more complex questions around data use. Do we as consumers simply accept the fact that data is fast becoming a priceless commodity that cannot be regulated?

The proposed new Data Protection Regulation seeks to introduce many new areas of control, albeit in a spasmodic fashion. Ultimately, is the EU ever going to be able to legislate to safeguard the way individuals have chosen to live their lives under the scrutiny of technology, its providers and governments? We wonder whether any legislation that has continually to catch up with the pace of global technology can ever perfectly address the question of privacy.

All this strikes us as more of a moral debate than a legal one. We live in free countries: have the people spoken by virtue of their choices? 

Lillian Pang is Legal Director at Rackspace®, the global leader in hybrid cloud and founder of OpenStack®, the open-source operating system for the cloud. 

Peter Lee is a committee member of the SCL Technology Group and a solicitor in Taylor Vinters LLP's Commercial & Technology team.


Published: 2014-02-10T10:42:20
