In the second in our series of articles on Smart Cities, Lachlan Urquhart and Dr Ewa Luger address the disconnect between DP regulation and systems design. Their research in this area may have identified one route to ‘creative compliance’.
Whilst smart cities are an amorphous concept, one thing remains constant: the underpinning systems rely on real-time data collection, convergence and use. Making a city 'smart' requires sensors embedded in the urban environment, often in ways not immediately visible to citizens. The resulting vision is that of a more 'aware' and responsive city, enabling greater knowledge of 'real world' movements, patterns and routines. Such information has the potential to improve urban and civic services from the citizen/consumer perspective, and to enable better management of non-discretionary goods such as energy. The smart grid is a good example: knowledge of everyday energy demand and usage can assist sustainability at the macro level, through more efficient production and supply, and at the micro level, through systems such as home energy displays that nudge consumers towards more efficient use.
Whilst such visions of the future have attracted both celebration and critique, our focus is on the scale of data processing, which presents a very practical set of regulatory challenges. Many smart city initiatives pull together data streams from different contexts into repositories, conduct analytics and publicly release the results for use in transportation, energy and civic engagement products/services, to name but a few. Yet managing large data streams, both at the point of collection and in use, requires careful planning, especially where personal data is involved. As recently stated in the UK ICO report on big data, the primary privacy concerns centre on satisfying subject access requests, increasing the transparency of data flows to the data subject, and managing the repurposing of data to ensure compatibility with the original purposes of collection. On the consent of data subjects, we share a similar concern: how can individuals agree to data processing if they do not know it is occurring? Yet the ICO has recently hinted that the 'legitimate interests' ground for lawful processing of big data might be sufficient for businesses, bypassing the need for consent.
Controlling who can access personal data, for what purposes, how long they keep it, where, and how it is secured are all classic data protection (DP) governance questions, and system designers have long played an important role in this regard. But the ambient, invisible nature of physically embedded technologies has a different character to web-based systems. Minimal user awareness and insufficient transparency of technical functions are two such issues. As more technologies become 'black boxes', the challenge of adequately informing citizens grows, as in the case of obtaining informed consent. Raising citizen awareness of how smart public space CCTV works, where data from smart transport cards is shared, or how home energy data helps grid operators understand daily routines is therefore important. Smart cities may create benefits, but they equally pose many risks, especially if user awareness is low and their concerns go unaddressed.
From a legal perspective, design is increasingly becoming the regulatory tool of choice. Whilst privacy by design (PbD) has for some time been recognised as best practice, Article 23 of the LIBE-amended EU General Data Protection Regulation (GDPR) may now legally mandate it. Article 23 requires technical and organisational measures that comply with the Regulation and protect data subject rights, reflecting technical knowledge, best practices, and risks to data subjects. The 'entire lifecycle management of personal data' should be considered 'from collection to processing to deletion, systematically focusing on comprehensive procedural safeguards regarding the accuracy, confidentiality, integrity, physical security and deletion of personal data'. Proportionate data collection, purpose-limited retention, and increased subject control over data distribution are also key considerations.
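The Regulation describes the lifecycle in words rather than mechanisms, but the idea of 'collection to deletion' can be made concrete in a design. The following is a minimal illustrative sketch only: all names (the record class, its fields, the purge function) are our own assumptions, not drawn from the Regulation or from any real system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical sketch of lifecycle management for personal data:
# each record carries its purpose of collection and a retention
# period, so deletion at end of life can be systematic rather
# than an afterthought. All names here are illustrative.

@dataclass
class PersonalRecord:
    subject_id: str
    data: str
    purpose: str                 # purpose limitation: why this was collected
    collected_at: datetime
    retention: timedelta         # purpose-limited retention period

    def expired(self, now: datetime) -> bool:
        """True once the record has outlived its retention period."""
        return now >= self.collected_at + self.retention


def purge_expired(records, now):
    """Systematic deletion: keep only records still within retention."""
    return [r for r in records if not r.expired(now)]
```

A design that attaches purpose and retention to each record at the point of collection, as above, makes proportionate collection and purpose-limited retention properties of the system itself rather than of later policy documents.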
To whom these obligations fall is not tightly defined in the legislation. However, given that the law invokes the notion of design, it seems designers have a key role to play here. As with many professions, the term 'designer' is a loose catch-all implicating a spectrum of actors from computer programmers and system architects to interface design specialists, or more likely some hybrid of these. The recent Article 29 Working Party Opinion 8/2014 on the Internet of Things (IoT) highlighted a range of actors in the IoT supply chain (many of whom could be deemed designers) that may handle personal data: from operating system and device manufacturers, standardisation bodies and data platforms to device owners, social platforms and app developers. Such actors have DP obligations, such as (a) always conducting privacy impact assessments (PIAs) before releasing new applications, (b) applying PbD and privacy by default principles, (c) providing users with self-determination and control over their data, and (d) employing 'user friendly' approaches for engaging with users.
However, within the design process of a new service or system, regulatory issues are often considered late in the cycle. Earlier reflection on compliance requirements can enable solutions to be built into the system from the start, instead of relying on adjustments and alterations after a technology has already been built. With ambient systems this is critical as, unlike web-based services, these technologies have a spatial context in which they sense the real world environment and physical behaviour of citizens. For example, how the layout/design of an urban space affords user movement may shape behaviour more than on-going notifications about where you can or cannot go (as we see in the work of situational crime prevention). Considering regulation, and especially DP regulation, at the earliest possible stage opens up more creative approaches to compliance. That being said, the practicalities are a little more complex. Despite their new regulatory role, designers are not lawyers. The main problem is the lack of effective tools that give designers an understanding of law and regulation without requiring them to become legal experts. Quoting complex legal doctrines, principles and data protection legislation at designers is of little use in forming a meaningful dialogue between the two communities. What is needed are tools that translate the law into a more accessible format, using forms and mechanisms with which designers are already familiar.
Ideation Cards as a Compliance/Awareness Tool
One method to improve understanding is the ideation card approach, which has a long history within design. IDEO pioneered its use in engineering, and Batya Friedman's work on value sensitive design explored how we might look beyond concerns of performance or interface intuitiveness in HCI to human values more generally. Ideation cards have been used widely to ground constraints, requirements and values in a tangible, operational way, ensuring reflective practice in systems design. Drawing upon this heritage, our team from the University of Nottingham and Microsoft Research created Data Protection Ideation Cards. These aim to empower designers in their new role by informing them of emerging changes in DP law on their own terms.
In our work we looked at four issues in the GDPR, namely the right to be forgotten/erasure (Article 17), the 24-hour data breach notification requirement (Article 31), the need for explicit informed consent to process personal data (Article 4) and the broader concept of privacy by design described above (Article 23).
A range of EU DP law experts helped inform the text used within the cards. We then tested our cards with focus groups of designers holding various levels of experience. We asked them to think about how they would design a specific system, for example a public targeted advertising touch screen, a smart car that rewards good driving, or a smart energy home management system. We also prescribed various constraints that might restrict the design (eg low cost or limited energy) and the target user (eg elderly users, children) before introducing the DP law cards.
Firstly, we found that designers were concerned about how much knowledge of DP law they needed to have. General DP knowledge existed, but specific insight was patchy and needs-driven, acquired in response to particular projects. Our 'designers' drew from multiple disciplines, ranging from conceptual and aesthetic to technical or functional design, highlighting the hybrid nature of this community. Designers for whom user interaction was a primary concern, such as user interface specialists (also largely from a more art-based background), were more open to engaging with DP law and developing creative solutions. System architects (with a programming/technical background) were more focused on technical functionality, and somewhat less concerned with user protection at the point of design, seeing it as something to be addressed at the end. When tasked with designing for explicit consent, all designers reviewed their system and sought to limit flows of data as a default, increasing user control by favouring local data storage solutions as opposed to automatic upload to the cloud.
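The privacy-by-default pattern the designers converged on, local storage first, cloud upload only on explicit opt-in, can be sketched in a few lines. This is a hypothetical illustration under our own assumptions: the class, its fields and methods are invented for this article, not taken from any system our participants designed.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of privacy by default: readings stay on the
# local device unless the user has explicitly opted in to cloud
# upload, and withdrawing consent clears the pending upload queue.
# All names here are illustrative.

@dataclass
class EnergyMonitor:
    cloud_consent: bool = False                       # explicit opt-in, off by default
    local_store: list = field(default_factory=list)   # data held on the device
    upload_queue: list = field(default_factory=list)  # data awaiting cloud upload

    def record(self, reading: float) -> None:
        """Always store locally; queue for upload only with consent."""
        self.local_store.append(reading)
        if self.cloud_consent:
            self.upload_queue.append(reading)

    def grant_consent(self) -> None:
        self.cloud_consent = True

    def revoke_consent(self) -> None:
        """Withdrawing consent also drops anything not yet uploaded."""
        self.cloud_consent = False
        self.upload_queue.clear()


monitor = EnergyMonitor()
monitor.record(1.2)       # stays local only
monitor.grant_consent()
monitor.record(0.8)       # stored locally and queued for upload
```

The design choice is that the restrictive state is the default and the permissive state requires an explicit action, which is the essence of the privacy by default obligation the Article 29 Working Party describes.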
Secondly, the user was, in the majority of cases, of principal concern, yet designers broadly saw their role of protecting users as distinct from compliance with the law. By leaving 'lawyers' to worry about regulation, designers felt free to imagine a system purely in terms of functionality.
Thirdly, they saw compliance with law as limiting the design of their systems. According to one participant, "When you're designing a system, at least for me, you always think of regulation as an afterthought. So, if I get what I want then I see how do I protect the user afterwards" (systems architect). What was clear to us was that the term 'compliance' holds negative connotations. Our participants saw regulation as either the stick with which to retrospectively beat the system into shape, or the means by which one protects oneself from future litigation.
Building a meaningful dialogue
Arguably, the above comment typifies the set of issues we need to address. The legal and design communities are disparate. They work separately, use a very different lexicon, focus upon different priorities and are subject to vastly different timescales. There seems, on the surface, no obvious point of collaboration. It may appear that the road to building meaningful dialogue is a long one; however, we know that each journey begins with a single step. We hope that our cards are only one of a raft of future tools that will guide us towards a culture of 'creative compliance'. Whatever forms the visions of smart cities take, regulation will be a central consideration, and we need to ensure our communities find better ways of working together so we can more effectively address the challenges ahead.
Dr Ewa Luger is a postdoctoral researcher in the Human Experience and Design group at Microsoft Research Cambridge and the Microsoft Research Fellow at Corpus Christi College, University of Cambridge.
 A29 WP "Opinion 8/2014 on the Recent Developments on the Internet of Things" 14/EN WP 223 - http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf
The project also included Prof Tom Rodden and Mike Golembewski, both of the University of Nottingham. The paper is E Luger, L Urquhart, T Rodden and M Golembewski, "Playing the Legal Card: Using Ideation Cards to Raise Data Protection Issues within the Design Process" (SIGCHI 2015). A downloadable version of the cards is available at http://www.designingforprivacy.co.uk/download.html