Is 2014 the Year UK Privacy Law Catches up with Mobile App Developers?

March 27, 2014

In January 2014 a new term entered the national vocabulary (at least for a short while): ‘leaky apps’. The phrase came courtesy of Edward Snowden, as the latest revelations from the former NSA whistle-blower showed that ‘leaky’ mobile applications were providing a rich harvest for spy agencies seeking to collect individuals’ personal data. Though this provided a fresh opportunity to berate the security services, the more interesting theme of the story was not that the security services could intercept data sent from mobile devices to app providers, but rather the breadth of personal data available for interception in the first place (given the amount of data individuals were sharing with app providers) and the ease with which it could be intercepted.

This article examines the legal regime surrounding the sharing of personal data across the mobile Internet, focusing in particular on the ‘Privacy in mobile apps guidance for app developers’ recently published by the ICO. We ask whether 2014 is the year when businesses involved in the development of mobile apps begin to have to take UK (and ultimately EU) privacy law seriously. 

‘Leaky apps’ – A mobile user’s perspective

The Guardian posed a question in its report about the interception of mobile data: are consumers aware of the amount of personal data they are sharing from their mobile devices? An interesting answer was provided by a recent YouGov survey, cited by the ICO when it launched the Guidance: 49% of mobile users had chosen not to download an app due to worries regarding the protection of their personal information. To flip this statistic on its head, it would seem that 51% were unaware of what they were being asked to share or did not allow privacy concerns to get in the way of the convenience and enjoyment mobile apps provide.

Regulatory scrutiny  

Mobile apps in the UK have historically been regulated by UK privacy law, notably the Data Protection Act 1998 and the Privacy and Electronic Communications (EC Directive) Regulations 2003. However, whilst the evolution of the mobile app industry has not gone unnoticed by the regulators, it is not an area that has previously drawn widespread ICO attention. There may be two reasons for this, the first being a question of the ICO prioritising its resources in light of public concern and the second being that it may be difficult to establish when an app developer has broken the law.  

If a consumer is concerned by an app, they can always delete it or decline to download it in the first place. Moreover, YouGov’s poll suggests that many people may not be particularly concerned by the privacy implications of downloading apps. Therefore, even where app developers are failing to comply with their legal obligations, their behaviour is not overtly intrusive compared with, for example, nuisance calls or spam texts, which prompted over 53,000 complaints to the ICO in six months (according to a December 2012 enforcement activity report) and which are therefore, understandably, of greater concern to the ICO.

This is where the Guidance aims to assist, not by making it easier for consumers to complain, but by explaining more clearly to app developers how they can comply, particularly by encouraging ‘privacy by design’ at the outset of a mobile app’s development. It is worth noting at this point that the Guidance (like all ICO guidance) is not an authoritative statement of UK law; however, both the English courts and the ICO must take account of its provisions.

Compliance with the Guidance would be persuasive evidence in any ICO investigation into a developer’s compliance with the DPA. Breach of the DPA can result in both criminal and civil sanctions (including penalties of up to £500,000 for serious breaches), as well as inevitable negative publicity and reputational damage for the developer in question.

Whilst the sanctions currently available under the DPA may not be of particular concern to large multi-nationals (one of the aspects the draft Data Protection Regulation is seeking to strengthen), reputational damage may be very harmful to an app’s success, regardless of the size of the developer, given that apps are often convenience rather than ‘must have’ items and that low barriers to entry have made the market very competitive. Developers of all apps destined for the UK market (whether for smartphones, tablet computers, smart televisions or games consoles) would therefore be well advised to review the security of their existing apps and, perhaps even more so, those in development, in order to ensure that any vulnerabilities are addressed as soon as possible.

What guidelines have been available to app developers previously?  

Whilst voluntary guidelines previously published by various non-governmental organisations have long been observed by many ‘privacy responsible’ app developers, the extent of compliance is unknown, particularly among small start-ups and individuals unfamiliar with both the data protection laws and the various complementary guidelines to which they are subject. In any event, app developers that were aware of the relevant voluntary guidelines would not necessarily have felt compelled to invest time and money in complying with them, because doing so could not guarantee compliance with UK privacy law. One might say there has been a reluctance, or certainly a lack of impetus, even to try to comply, given that such rules and guidelines have not been actively enforced: ultimately, businesses developing apps have concluded that the risk of being found non-compliant is minimal and that, even if they are, the loss (financial and reputational) is similarly minimal.

ICO Guidance

Who needs to comply?

Developers of mobile apps need only comply with the DPA and the corresponding Guidance in respect of each app that deals with ‘personal information’. In the mobile app context, ‘personal information’ is data that alone, or together with further information collected by the same (or an associated) app developer, is capable of identifying a living individual. As the practical examples in the Guidance illustrate, the definition captures a wide range of information, from typical identifiers such as names and addresses to identifiers of specific relevance in the app environment, including the device’s telephone number, International Mobile Equipment Identity (IMEI) number and Media Access Control (MAC) address. Broadly speaking, any information that could be used to treat one individual differently from another is likely to constitute personal data and, as the Guidance suggests, if the position is unclear, it is likely to be simpler from the developer’s perspective to treat the data as ‘personal’ from the outset.

Where personal data is likely to be processed, developers should also consider whether they would be the ‘data controller’ with respect to each of their apps at any point in its life cycle, as this will confer legal responsibility on them to ensure compliance with the DPA. The data controller, in this context, is the person or organisation that determines how the app user’s personal data will be dealt with. This crucial determining factor is, however, not always clear in the mobile app environment, where various parties are often involved in the development, funding, or ongoing operation of any given app. In providing practical examples as to who the data controller is likely to be in this context (eg for social media apps, reviews apps, note-taking apps and advertisement-funded games), the Guidance recognises and addresses the particular challenge of ascertaining which party is the data controller for the relevant app. As with any data protection issue, this will be fact dependent. Factors such as how the app is funded and where the app is hosted will be particularly relevant in the mobile app environment. For example, where an advertiser funds development of an app and uses personal data from that app to provide targeted marketing to the customer, that advertiser rather than the developer is likely to be considered the ‘data controller’ with regard to that app, on the basis that the advertiser will determine how the customer’s personal information will be dealt with. Conversely, where an app uploads the personal information of customers onto a central server controlled by the developer in order to enable users to share information with each other, the developer will most likely be regarded as the ‘data controller’.

The Guidance sensibly reminds developers that, whether they are the ‘data controller’ or not, they should nevertheless, as a matter of good practice, take steps to explain clearly to the customer how their personal information will be treated. The Guidance specifically states that where advertising supports an app, users must be informed by the developer accordingly and be given information relating to the analytics used by the app.  

Key provisions 

The Guidance focuses on five key aspects of data protection compliance in the mobile app context: (i) what data can be collected; (ii) how users should be informed and how their consent should be gained; (iii) how users will be given feedback and control; (iv) how data will be kept secure; and (v) how developers should test and maintain their app. 

What data can be collected? 

The ICO provides the following guidance on collecting data in the ‘least privacy-intrusive’ way possible via mobile apps: 

  • Collect and process the minimum data necessary for the app to function as intended (eg, remove unnecessary metadata from images before uploading them and design apps that process information on the device itself as opposed to in a way that necessitates the transfer of data to a central server controlled by the developer, such as in the context of location apps).
  • Do not store data for longer than necessary.
  • Allow users to permanently delete their personal data.
  • Collect usage or bug data only with informed consent from the user or alternatively in anonymised form (in the case of anonymisation, the developer should ensure that the data is minimised prior to anonymisation). 

For the purpose of achieving the above goals, the Guidance suggests that consideration should be given at the early stages of the design process to the types of data that any given app might collect, how important that data is to the app’s purpose and, where the data is to be transmitted, the potential impact on users if any of the transferred data were to be misused. The Guidance also indicates that a full privacy impact and security assessment should ideally be carried out but these are not statutory requirements.   
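The final bullet above (minimise, then anonymise, usage and bug data) can be illustrated with a short sketch. The field names and the report structure below are hypothetical, and the Guidance does not prescribe any particular technique; this simply shows the principle of dropping unnecessary fields before replacing the device identifier with a salted one-way hash:

```python
import hashlib
import os

# Application-wide secret salt. In practice this would be held server-side,
# so the hash cannot be reversed by anyone who obtains only the report.
SALT = os.urandom(16)

def anonymise_usage_record(record: dict) -> dict:
    """Minimise a usage record, then anonymise the remaining identifier.

    Keeps only the fields needed for bug analysis and replaces the
    device ID with a salted one-way hash, reflecting the Guidance's
    point that data should be minimised *before* anonymisation.
    """
    minimal = {k: record[k] for k in ("app_version", "crash_reason") if k in record}
    device_id = record.get("device_id", "")
    minimal["device_hash"] = hashlib.sha256(SALT + device_id.encode()).hexdigest()
    return minimal

report = anonymise_usage_record({
    "device_id": "35-209900-176148-1",  # IMEI-style identifier
    "location": (51.5074, -0.1278),     # unnecessary for bug reports: dropped
    "app_version": "2.1.0",
    "crash_reason": "NullPointerException",
})
```

Note that a salted hash of a device ID is pseudonymised rather than fully anonymised data; whether further steps are needed will depend on the context.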

How should users be informed and how should their consent be gained? 

In acknowledgement of the challenges faced by developers when attempting to present sufficiently detailed privacy information to mobile users on the small screen of a mobile device whilst maintaining a ‘user-friendly’ interface that accommodates the entire user experience, the ICO provides the following guidance relating primarily to the presentation of privacy notices: 

  • Privacy notices need not be contained in one large document, and can instead be integrated into the app using a layered approach, where the more important privacy points are summarised, with more detailed information available should the user choose to see it via easy-to-follow links.
  • Privacy information should be provided to the mobile user as soon as possible, preferably prior to download; ‘just-in-time’ notifications can also be used immediately prior to the processing of personal data where necessary, in particular where the app is about to collect intrusive data such as GPS location.
  • There should be a clear explanation of which data will be processed and why it will be processed (if useful, developers are also advised to state what will not be done with the data). For this purpose, it is preferable to highlight any onerous actions so as not to mislead the mobile user and inform the user when the app passes the data on to any other organisations.
  • Plain English should be used, in language appropriate to the age range of the target audience.
  • If the developer acts as data controller, the user should be informed accordingly. Additionally and whether the developer is data controller or not, users should be given the developer’s contact details.  

As to the detailed content of the privacy notice itself, which must, in summary, inform the user of the app about how their personal data will be used if they install the app (whether by the developer or a third party), the Guidance directs developers to its ‘Privacy notice code of practice’. 

The key message is generally that developers must always be transparent as to the way that a user’s information will be processed, and this can be achieved by presenting the relevant information in as user-friendly a manner as possible.  

How users will be given feedback and control 

Where possible, users should be given a degree of control over the use of their data, including a choice as to which personal data will be collected by the developer, and the option to review and change their data privacy choices in one settings location. 

Data security 

The Guidance provides a high level overview of good security practice in relation to the design of both the app and the central servers with which the app communicates. Key elements are set out below:  

  • Ensure, where possible, that passwords are salted and hashed on any central server. Inform users in the event that hashing is not possible (for example in password manager apps).
  • Always use encrypted connections for any sensitive information, including usernames, passwords and device IDs, and use encryption techniques where the app stores data for later use.
  • Use tried and tested cryptographic methods and codes for performing functions with well-established and re-usable implementations (such as app updates).
  • Particular caution should be taken to respect the sensitivity of data accessed from other apps.
  • Ensure that input is sanitised appropriately when an app accepts input from other apps, to prevent ‘inter-app injection flaws’.
  • Where an app communicates with another party, ensure that the connection is secured using Secure Sockets Layer (SSL) or Transport Layer Security (TLS) encryption, and verify the other party’s identity by checking their certificate.
  • Ensure that any central server has a valid SSL/TLS certificate and that any central server accepts strongly encrypted connections only. 
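The first bullet in the list above (salting and hashing passwords on the central server) can be sketched using Python’s standard library. This is an illustrative sketch only, not production code; a dedicated password-hashing scheme (such as bcrypt, scrypt or Argon2) would normally be preferred on a real server:

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor; tune to available hardware

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, hash) using PBKDF2-HMAC-SHA256 with a per-user salt.

    A unique random salt means two users with the same password do not
    share the same stored hash, defeating precomputed lookup tables.
    """
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(digest, expected)

salt, stored = hash_password("correct horse battery staple")
```

The server stores only the salt and the resulting digest, never the plaintext password, so a compromise of the database does not directly reveal users’ credentials.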

The Guidance recognises that different apps have different privacy and security risks, and recommends conducting security testing, vulnerability scanning, and more in-depth penetration testing accordingly, prior to rolling out the app.  
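The SSL/TLS points in the list above can also be sketched with Python’s standard library, where `ssl.create_default_context` enables both certificate-chain and hostname verification by default. This is a generic illustration under the assumption of a plain socket client; a real mobile app would use its platform SDK’s networking stack instead:

```python
import socket
import ssl

def open_verified_connection(host: str, port: int = 443) -> str:
    """Open a TLS connection that verifies the server's certificate and
    hostname, returning the negotiated protocol version."""
    # create_default_context() loads the system CA store and enables
    # certificate and hostname verification by default.
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()
```

The important design point is what the code does *not* do: it never disables `check_hostname` or sets `verify_mode` to `CERT_NONE`, shortcuts that are a common source of the ‘leaky’ behaviour the Guidance warns against.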

Testing and maintenance of apps 

The Guidance reminds developers that testing the privacy behaviour of their app is an ongoing process that should be conducted not only during the development stage but also following any changes to the app’s code. For example, developers should verify that a decision to deny access has the desired effect and that data is retained no longer than the stated retention period, and they should continually seek to update their privacy controls in line with technological advancements. In the event that changes to an app require new data processing, users must be informed accordingly. Any security vulnerabilities should be dealt with swiftly, typically by informing users of the problem and issuing an amended and updated version of the app.  

Other compliance laws 

The Guidance directs app developers to other relevant UK laws and guidance notes and considers when they are likely to apply, including the Regulations (where an app uses cookies or is used for communicating, including for marketing purposes), the PhonepayPlus guidance (where an app uses premium rate services), the Consumer Protection from Unfair Trading Regulations (which concern best practice as to the treatment of consumers by businesses), the Unfair Terms in Consumer Contracts Regulations (which concern transparency and fairness in relation to consumer contracts), the OFT’s Principles for the use of Continuous Payment Authority (where an app accepts payments) and the OFT’s principles for developing online and app based games. 

Leaky apps: Who is to blame? 

Whilst the Snowden revelations and associated coverage indicated that mobile app developers may have been knowingly transferring data to, or collaborating with, organisations such as the NSA and GCHQ, other reports suggest that the NSA and GCHQ have simply developed superior capabilities for collecting data, well beyond the security measures that are or could be used by developers to counter their efforts.

Irrespective of the above, there is one recurring theme in the responses given to questions over ‘leaky apps’. GCHQ declined to comment on any specific programme, but reiterated that all of its activities complied with UK law. The NSA similarly relied on the legality of its actions (although President Obama has since announced new restrictions on government surveillance). When Rovio Entertainment, the creator of the ‘Angry Birds’ app (reported to have been downloaded over 1.7 billion times and expressly mentioned in the Snowden leaks), was criticised for selling data to mobile ad companies, it too claimed that its actions were lawful, although it firmly denied knowingly sharing data with the security services.

Given that all parties invoke lawfulness in defence of their actions, it is arguably the legal regime itself that has spawned this situation. The law under which the security services operate is beyond the scope of this article; however, it is trite to say that privacy law has failed to keep pace with technology, and worth noting that the Data Protection Directive (on which the DPA is based) was drafted in the early 1990s. Privacy law is, however, changing, and the proposed EU Data Protection Regulation may offer some assistance in protecting personal data by at least requiring data controllers to take further steps to ensure they have obtained an individual’s consent, for example by eliminating the option of relying on implied consent, requiring data controllers to bear the burden of proving individuals’ consent, and requiring consent to processing to be separated from other agreements.

Conclusions

Whilst the ICO Guidance represents a welcome step towards increased protection for mobile app users’ personal information in the rapidly growing mobile app industry, it is doubtful that even full compliance with its limited ‘security’ provisions (or the previous voluntary guidance for that matter) would have prevented the NSA or GCHQ (with their highly advanced techniques for obtaining information) from accessing such personal data in the first place. Moreover, even the additional protection provided by the proposed Data Protection Regulation is unlikely to restrict the security services’ ability to intercept mobile data. 

However, cumulatively (along with continued ICO scrutiny of this area), these measures may discourage the sharing of so much personal data from mobile devices in the first place. As suggested in the introduction to this article, this is probably a far more important step in protecting the privacy of individuals. The ICO’s clear message is that app developers must now take it upon themselves to comply with its recommendations in the most responsible way possible so as to protect an individual’s personal data. This might actually involve thinking beyond the high level provisions of the Guidance. For example, in the context of security, developers will be expected to think carefully, from a design perspective, about precisely how they can prevent infiltration by outside organisations.  

Of course it is worth bearing in mind that users may not necessarily want the law to protect their data if this means restricting the utility of their apps. As the Guidance itself notes, ‘consumers’ expectations of convenience can make it undesirable to present a user with a large privacy policy, or a large number of prompts or both’. Perhaps these expectations of convenience may also make it undesirable for the law to make any substantive restrictions on what app developers can do with personal data. As the law, technology and public expectations develop during 2014 and beyond, we may begin to find out.

Sarah Pearce is a Partner at the London office of Edwards Wildman.

Jonny McDonald is an Associate there.

Katy Whitfield is a Trainee, also at the London office of Edwards Wildman.