Wearable Technology and the GDPR

February 17, 2016

A vast number of wearable devices, including health and fitness wearables, have entered the commercial market in recent years, and predictions show that this trend is set to continue. Whether it be a fitness tracker, smartwatch or item of clothing, the amount of personal data that wearable devices can collect is phenomenal. Wired editors Gary Wolf and Kevin Kelly coined the term ‘quantified self’ in 2007 to describe a user of a wearable device who constantly collects, monitors and analyses the personal data that their wearable device(s) generate. A stand-alone fitness tracker can measure a user’s number of steps, calories burnt, distance covered over a given period and quality of sleep. This personal data is largely recorded in an accompanying application that is synchronised to a user’s smartphone. Alternatively, the data may be uploaded to a user’s computer or accessed online if it is stored in the cloud.

A smartwatch, such as the Apple Watch, can perform similar functions to a fitness tracker but its overall capabilities are far more extensive. The press release announcing the launch of Apple Watch lauded it as Apple’s ‘most personal device yet’.

It is this ability to personalise wearable devices that makes them different from other portable computing devices. By their very nature wearable devices are worn on a person and thus data can be continuously collected via the wearable’s sensors whenever the device is worn. Any wearables that have an inbuilt GPS system or are connected to a smartphone can also determine a user’s location via the GPS technology.

While the personal data that wearables collect has great potential to tackle contemporary health problems such as obesity, data protection experts and organisations, including the Article 29 Working Party, have raised concerns about the level of data that wearables can collect, the ability to profile users, the ease of data sharing amongst users and the security of the device and its data. In general, in order to use a wearable device, a user needs to consent to the manufacturer’s terms and conditions, including its privacy policy. A user has little option but to consent to the collection and processing of their personal data in order to use the wearable device. It is therefore important to consider whether the GDPR’s Article 23, Data Protection by Design and by Default, together with Articles 38 and 39, Codes of Conduct and Certification, will in practice combine with other provisions of the GDPR to help minimise the amount of personal data a wearable device collects and give users adequate protection and control of their data.

Data Protection by Design and Default

Data protection by design, often referred to interchangeably as privacy by design, is a strategy advocated by a number of Data Protection Authorities around the world, including the ICO. The concept, attributed to Dr Ann Cavoukian (former Information and Privacy Commissioner of Ontario), is embodied in the 7 Foundational Principles.[i] The Principles advise that data protection be embedded into design from the outset to create a culture whereby privacy is integrated into every project. Full functionality is advocated to ensure legitimate interests are treated equally, for example privacy and security, such that neither interest is prioritised over the other. Further features of the Principles include security for the complete lifecycle of the data; privacy by default, which sets the default treatment of data at maximum privacy; and respect for user privacy, which companies can demonstrate by offering suitable privacy defaults and appropriate privacy notices.

Until now the adoption of data protection by design and default has been voluntary and a matter of good practice. The agreed GDPR will, however, make it necessary for each data controller to consider ‘having regard to the state of the art and the cost of implementation … appropriate technical and organizational measures’ to fulfil this requirement. Recital 61 of the GDPR gives a clear explanation of how controllers can achieve this objective, advocating that internal policies are adopted and certain measures are implemented. A non-exhaustive list of possible measures is also provided, including pseudonymising personal data as soon as possible, enabling data subjects to monitor the processing of their data, and improving security features. Controllers are to encourage the producers of products, services and applications that ‘are either based on the processing of personal data or process personal data to fulfil the task’ to consider a user’s right to data protection in the development and design of their products, services and applications.
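The pseudonymisation measure mentioned above can be illustrated with a minimal sketch: a direct identifier is replaced with a keyed hash at the point of ingestion, so that downstream analytics never handle the raw identifier. This is only one possible technique, and the field names and key handling below are hypothetical, not drawn from any real wearable platform.

```python
import hmac
import hashlib

# Hypothetical secret key; in practice it would be stored separately
# from the pseudonymised data and rotated periodically.
SECRET_KEY = b"rotate-and-store-separately"

def pseudonymise(record: dict) -> dict:
    """Replace the direct identifier with a keyed hash as soon as data is ingested."""
    token = hmac.new(SECRET_KEY, record["user_id"].encode(), hashlib.sha256).hexdigest()
    out = dict(record)
    out["user_id"] = token  # downstream systems only ever see the token
    return out

raw = {"user_id": "alice@example.com", "steps": 9214, "sleep_hours": 7.5}
safe = pseudonymise(raw)
print(safe["user_id"] != raw["user_id"])  # prints True
```

Because the hash is keyed, the same user maps to the same token (allowing longitudinal analysis) while re-identification requires access to the separately held key.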

The recognition of the impact various stakeholders contribute to a controller meeting their data protection obligations is crucial and highly relevant to the wearable ecosystem which is a complex network. Stakeholders involved in the wearable sector include operating system and device manufacturers, app stores, app developers and social media platforms. Users themselves can be classified as another stakeholder and need to be responsible for the information that they disclose and share.

Recent reports highlight how difficult it can be in the mobile computing and wearable environment to ensure that an app is transparent and compliant with its privacy policy as well as the app platform’s developer agreement. A study by Imperial College London, published in September 2015,[ii] found that some apps approved by NHS England’s Health Apps Library did not meet data protection standards and collected and shared far more data than was allowed for in their privacy policies, many of which were poorly written. Another study, published in Technology Science[iii] in October 2015, similarly highlighted how both Android and Apple apps leaked personal data. The Apple apps reportedly shared users’ location details, names and email addresses. According to the Technology Science study, the apps in question were analysed in June and July 2014, at which time Apple had not publicly adopted its privacy by design strategy. Nonetheless it was widely understood at the time that the acceptance standards for Apple’s App Store were far more stringent than those of alternative app platforms. Given the vast number of apps on any app platform, and even assuming the assistance of machine technology, how robust are the checks and balances? Both of these reports demonstrate how difficult it is for users of wearables to be confident that the apps they engage with via their devices actually protect their personal data.

The GDPR will require stakeholders to consider the concept of data protection by design and by default more rigorously, taking a risk-based approach. This risk-based approach is unfortunately not as forceful as the regulation could have been. However, in order to distinguish and underline their commitment to this concept, controllers can use ‘data protection certification mechanisms and … seals and marks’ to visibly show data subjects that they have complied with their data protection obligations in this area. In an age where branding is so significant to consumers, endorsements of a company’s data protection standards by way of certificates, seals or marks may be extremely significant to gaining data subjects’ loyalty and trust.

It is interesting to consider how in practice the concept of data protection by design and default will be communicated to all stakeholders in the wearable ecosystem. In its Opinion WP202, the Article 29 Working Party considered that app designers were the biggest risk to data protection due to their lack of understanding of the subject. Controllers will ultimately have the responsibility to ensure that developers are aware of and employ data protection by design and by default. By adopting a more agile approach now, rather than waiting for the GDPR to come into force, controllers can begin to change the culture and habits of those in the ecosystem who are at present either unaware of this concept or not using it. In January 2015 ENISA published its report Privacy and Data Protection by Design – from policy to engineering[iv] to highlight how technology can be used alongside the legal framework to achieve data protection by design.

Already the market is seeing companies use suitable data protection policies and best practices, such as data protection by design and by default, as a competitive differentiator. While no system can fully protect data, the impact of any data breach which does occur will, in theory, be lessened by the reduced amount of data collected. The GDPR explicitly requires that ‘only personal data which are necessary for each specific purpose of the processing are processed’. This obligation extends to the amount of data collected, the level of processing, the storage time and the accessibility of the data. Additionally, in order for data to be released to ‘an indefinite number of people’, the data subject needs to intervene and override the default position.
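In software terms, one way to honour this default position is to make the most restrictive settings the constructor defaults, so that wide release of data requires an explicit user action. The sketch below illustrates the idea; the class and field names are illustrative assumptions, not taken from any real wearable SDK.

```python
from dataclasses import dataclass

@dataclass
class SharingSettings:
    # Privacy by default: every option starts at its most restrictive value.
    share_location: bool = False
    share_activity_publicly: bool = False
    retention_days: int = 30              # shortest storage period as the baseline
    collect_fields: tuple = ("steps",)    # only the data needed for the core purpose

    def make_public(self) -> None:
        """Releasing data to an indefinite audience requires an explicit user act."""
        self.share_activity_publicly = True

settings = SharingSettings()
print(settings.share_activity_publicly)  # prints False until the user intervenes
```

The point of the design is that a user who never touches the settings ends up with minimal collection, minimal retention and no public sharing, which is precisely the default position the GDPR text describes.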

The stronger enforcement provisions in the GDPR will arguably affect how seriously controllers and processors take the new Regulation. With the ability to impose corrective powers and/or administrative fines, supervisory authorities will have a number of options at their disposal. Whether or not a controller or processor has given due consideration to data protection by design and by default will be taken into account when an administrative fine is contemplated, and thus the benefits of adopting this strategy where applicable should be abundantly clear to controllers.

Data protection impact assessment

Conducting a data protection impact assessment will assist controllers in assessing the level of risk their data processing poses to an individual’s rights and freedoms, as well as help to determine the appropriate technical and organisational measures which need to be adopted in order to manage those risks. Controllers will need to consider the processing operations they will undertake, the purpose of the processing, and its necessity and proportionality. Under Article 33, controllers are required to undertake a data protection impact assessment where the processing involves new technology and is likely to result in a high risk to an individual’s rights and freedoms. Focusing a controller’s attention on the purpose of the processing will inevitably require the controller to consider the personal data that will be collected. Consequently, the amount of data collected may be further reduced by this assessment.

Codes of conduct

A further way for the wearable sector to embrace the values of the GDPR and best practice will be to adopt a code of conduct (as provided for in Article 38) that can be drafted by associations and representatives of specific sectors. The codes can be used to demonstrate that controllers and processors have specifically considered the relevant provisions of data protection by design and by default. Controllers and processors who are not subject to the GDPR can also adopt codes of conduct in order to evidence that there are appropriate safeguards in place in respect of data transfers to third countries or international organisations. The Commission has given considerable support to codes of conduct and will, in certain circumstances, give publicity to approved codes. This endorsement will provide added value to stakeholders in the wearable sector who firmly prioritise the protection of users’ personal data. How straightforward it will be in practice to arrange and implement a code of conduct remains to be seen and interested parties may want to follow the development of the Code of Conduct for mHealth applications. Codes of conduct may well be time and investment intensive, although time spent up front collaborating and undertaking this exercise may be extremely beneficial to building consumer trust in an expanding wearable market. 


The GDPR, once in force, will provide a cohesive legislative framework to strengthen the protection of an individual’s personal data in a digital society. The adoption of data protection by design and by default certainly has the potential to limit the amount of data that wearables can amass and process. What is crucial for manufacturers of wearable and IoT devices is that they ensure their whole ecosystem maintains the same standard of data protection. Anything less could seriously affect a controller’s compliance and reputation.

The GDPR may well enjoy a longevity similar to that of the Data Protection Directive 95/46/EC. Technology neutral, the GDPR’s objective is to be ‘future proof’ – a difficult task given the acceleration of technological change. Thus, as the wearable and IoT environments gather pace and become even more integral to data subjects, it is important that regulators consider whether any amendments are required and, if so, that these are made in a timely fashion. This will ensure data subjects’ personal data continues to be protected at the level the GDPR offers in its agreed form.

Data protection has certainly taken centre stage, and consumer awareness of it is perhaps greater than at any other time. One aspect that the legal framework of the GDPR does not address is the requirement that data subjects actually understand the privacy policies which describe how their data is collected and processed. While this may well be impossible, given the vast range of users, the above methods do go some way to robustly protecting a subject’s data. Furthermore, stakeholders can adopt a Plain English Campaign strategy for their privacy documents.

Now is certainly an opportune time for all stakeholders in the wearable ecosystem to take a proactive approach, reduce any excessive collection of data, adopt best practices and plan how to implement the objectives of the GDPR.  

Lorna Cropper LL.M., CIPP/E is a Solicitor specialising in IT, IP and Data Protection law. Twitter @LornaLCropper

Note that reference to the GDPR text in this article is to the unofficial agreed compromise.

[i] 7 Foundational Principles

[ii] www3.imperial.ac.uk/newsandeventspggrp/imperialcollege/newssummary/news_25-9-2015-10-0-3

[iii] Zang J, Dummit K, Graves J, Lisker P, Sweeney L. Who Knows What About Me? A Survey of Behind the Scenes Personal Data Sharing to Third Parties by Mobile Apps. Technology Science. 2015103001. October 30, 2015.

[iv] https://www.enisa.europa.eu/activities/identity-and-trust/library/deliverables/privacy-and-data-protection-by-design