GDPR: Data Protection Impact Assessments

Olivia Whitcroft takes a careful look at DPIAs in light of the further guidance on the GDPR that has been issued since its finalisation

Early in 2016, Computers & Law (Vol 26, Issue 6) featured my article on ‘Privacy Impact Assessments and the GDPR‘. Back then, my life was about preparing for dinner parties and excitedly examining the text of the brand new EU General Data Protection Regulation. Now, 18 months later, things have moved on. Firstly, my dietary risk assessments are no longer focused on dinner parties; they now revolve around what my eight-month-old baby should or should not be eating. On the list of high-risk foods are grapes, and processing activities include putting them in a blender to mitigate this risk. 

Secondly, with under a year to go until the GDPR applies, we now have more guidance on what the requirements for data protection impact assessments (DPIAs) mean in practice. The Article 29 Working Party has published Guidelines on DPIAs and determining whether processing is ‘likely to result in a high risk’ (WP248) (DPIA Guidelines). These were published on 4 April 2017 and were open for comments until 23 May 2017. At the time of writing, no updated version has been published.

The Working Party has also produced Guidelines on Data Protection Officers (WP243) (DPO Guidelines). These were originally published on 13 December 2016, and a revised version was published on 5 April 2017. They include guidance on the meaning of ‘large scale’ and ‘systematic’ processing, which assists in identifying when a DPIA is needed (as well as when a DPO is required).

In addition, the Information Commissioner's Office has published a discussion paper on profiling and automated decision-making (ICO Discussion Paper), which are activities for which a DPIA may be required. The paper was published in April 2017 and the feedback will inform the ICO's input into the drafting of EU guidance in this area.

What is a high risk activity?

We know from the text of Article 35(1) of the GDPR that DPIAs need to be carried out for high risk activities, including the three areas specified in Article 35(3). These are, in summary: (a) systematic and extensive automated evaluation of personal aspects of individuals (including profiling) on which decisions affecting them are based; (b) large-scale processing of sensitive personal data; and (c) systematic and large-scale monitoring of publicly accessible areas. But what specific activities are captured by sub-sections (a) to (c), and what otherwise constitutes a ‘high risk’?

The ICO Discussion Paper gives examples of activities falling under sub-section (a):

  • profiling and scoring for the purposes of risk assessment (eg credit scoring, insurance premium setting, fraud prevention, detection of money laundering);
  • location tracking, for example by mobile apps, to decide whether to send push notifications;
  • loyalty programmes;
  • behavioural advertising; and
  • monitoring of wellness, fitness and health data via wearable devices.

This would also capture partially automated processing, and therefore a DPIA may still be needed even if there is some human involvement in the process.

Sub-section (b) captures processing on a large scale of data about racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, health, sex life, sexual orientation, as well as genetic data, biometric data for unique identification and data relating to criminal convictions and offences. The DPO Guidelines provide guidance on how to interpret ‘large scale’: it should take into account the number of individuals, the volume of data and range of data items, the duration of the activity and the geographical extent of the activity.

Unfortunately, they do not provide guidance on specific number ranges. We are therefore still facing uncertainty on whether the numbers of individuals and sensitive data items involved in a project are sufficiently great to be ‘large scale’. The DPO Guidelines envisage the development of standard practice for interpretation in more specific or quantitative terms for certain types of common processing activities.

Examples of activities which do constitute large-scale processing are:

  • processing of patient data in the regular course of business by a hospital; and
  • processing of personal data for behavioural advertising by a search engine.

Examples of activities that do not constitute large-scale processing are:

  • processing of patient data by an individual physician; or
  • processing of client data on criminal convictions and offences by an individual lawyer.

Of course, for the purposes of DPIAs, organisations still need to consider whether these activities constitute high risk for a specific project (on the basis of them being large scale or otherwise).

The DPO Guidelines are also useful for interpreting sub-section (c). As well as considering ‘large scale’, they discuss the meaning of ‘systematic’: occurring according to a system; pre-arranged, organised or methodical; taking place as part of a general plan for data collection; or carried out as part of a strategy.

The DPIA Guidelines give a piazza, a shopping centre, a street and a public library as examples of a ‘publicly accessible area’, which indicates that sub-section (c) is intended to capture monitoring of a physical space (for example using CCTV, drones or body-worn devices). However, monitoring online may also be considered high risk, as discussed below.

In addition to sub-sections (a) to (c), the DPIA Guidelines set out a list of criteria to be considered in determining whether an activity is likely to result in a high risk. Organisations may wish to draw on these when preparing screening questions or checklists for business teams to complete as part of project initiation.

As a rule of thumb, if the activity meets two or more of these criteria, a DPIA will be required. This rule may be overturned by the context: an activity meeting only one criterion could still carry a high risk, and an activity meeting two or more criteria may be of lower risk. It is therefore important to assess the substance and document the reasons for the decision; a simple screening sketch based on this rule of thumb follows the list below.

The criteria are:

  • Evaluation or scoring, including profiling and predicting. Examples include a bank screening its customers against a credit reference database, a biotechnology company offering genetic tests to assess and predict health risks, and a company building behavioural or marketing profiles based on use of its website.
  • Automated decision-making with legal or similar significant effect. This includes where the processing may lead to exclusion of or discrimination against individuals.
  • Systematic monitoring. This covers processing used to observe, monitor or control data subjects, and is wider than (but includes) monitoring of publicly accessible areas listed in Article 35(3).
  • Sensitive data. As well as use of special categories of data and data relating to criminal convictions, this includes other data which can increase the risks to individuals, such as electronic communications data, location data, financial data, and information produced for personal activities when using online services (such as email or document management services).
  • Data processed on a large scale. This is wider than the specific large-scale activities listed in Article 35(3).
  • Datasets that have been matched or combined. This includes combining data used for different purposes or from different organisations.
  • Data concerning vulnerable data subjects. Such data subjects may include employees, children, the mentally ill, asylum seekers, the elderly, patients, and other individuals where there is an imbalance in the relationship between them and the organisation.
  • Innovative use or application of technological or organisational solutions. Examples include combining use of fingerprint and face recognition for improved physical access control, and Internet of Things applications.
  • Data transfer across borders outside the European Union. This should take into consideration the country of destination, the possibility of further transfers or the likelihood of transfers based on derogations.
  • Processing which prevents data subjects from exercising a right or using a service or a contract. This includes processing performed in a public area that people passing by cannot avoid.
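
By way of illustration only, the sketch below shows one way a project team might record answers to screening questions built around these criteria and the rule of thumb described above. The criterion labels, the threshold of two and the override mechanism are assumptions made for the example, not terminology or a methodology from the Guidelines.

```python
# Minimal DPIA screening sketch based on the WP248 rule of thumb:
# meeting two or more criteria suggests a DPIA is required, subject
# to a documented, context-specific override. The criterion labels,
# the threshold of two and the override flag are illustrative
# assumptions, not official terminology.

WP248_CRITERIA = {
    "evaluation_or_scoring",
    "automated_decisions_with_significant_effect",
    "systematic_monitoring",
    "sensitive_data",
    "large_scale_processing",
    "matched_or_combined_datasets",
    "vulnerable_data_subjects",
    "innovative_technology",
    "transfers_outside_eu",
    "prevents_exercise_of_rights_or_access_to_services",
}

def screen_project(criteria_met, override=None, override_reason=""):
    """Return a screening outcome plus the reasoning to be documented."""
    matched = sorted(c for c in criteria_met if c in WP248_CRITERIA)
    dpia_indicated = len(matched) >= 2          # rule of thumb: two or more
    if override is not None:                    # context may displace the rule
        dpia_indicated = override
    return {
        "criteria_met": matched,
        "dpia_indicated": dpia_indicated,
        "override_reason": override_reason,     # document the reasons
    }

# Example: a behavioural advertising project meeting two of the criteria.
print(screen_project({"evaluation_or_scoring", "large_scale_processing"}))
```

The output of such a screening step could then sit alongside the documented reasons for the decision in the project record.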

The Guidelines go on to give examples of when a DPIA would be required, using these criteria:

  • A hospital processing its patients' genetic and health data, as it involves sensitive data and vulnerable individuals. This may also be large scale, as discussed above.
  • The use of a camera system to monitor driving behaviour, including video analysis to single out cars and automatically recognise number plates. This involves systematic processing and technological solutions.
  • A company monitoring its employees' activities, including their work station and internet activity. This involves systematic monitoring and vulnerable data subjects.
  • The gathering of public social media profiles to be used by companies to generate profiles for contact directories. This involves evaluation and large-scale processing.

When is a DPIA not required?

A DPIA is not required for projects where a high risk is not likely. An initial risk assessment will need to be carried out (for specific activities or categories of activity) to determine this. Examples within the DPIA Guidelines of where a DPIA is unlikely to be needed are:

  • An online magazine using a mailing list to send a generic daily digest to its subscribers. This does not obviously involve any of the criteria listed above.
  • An e-commerce website displaying adverts for vintage car parts, involving limited profiling based on past purchase behaviour on certain parts of its website. This involves evaluation, but is not systematic or extensive.

The ICO may also publish a list of activities for which no DPIA is required (under Article 35(5)), though these may be subject to other compliance rules or guidelines.

Article 35(10) also contains an exception for regulated activities carried out pursuant to a legal obligation or public interest. A DPIA may not be required if one has already been carried out as part of setting the legal basis for those activities. Unfortunately, the DPIA Guidelines do not provide any examples of when this applies.

Even where a DPIA is not required under Article 35, the general principle of ‘accountability’ and obligations under Articles 24 and 25 should still be applied. These require data protection by design and by default, and the ability to demonstrate compliance with the GDPR, taking into account the risks involved. Therefore, whether or not in the form of a DPIA, some level of risk and compliance assessment should be carried out for all new systems or activities involving personal data. The results of these will also assist in preparation and maintenance of records of processing activities under Article 30.

One DPIA covering several similar projects

Article 35(1) allows a single DPIA to be used for similar activities that present similar high risks. This means that similar projects in different parts of an organisation, or at different times, or even by different parties, could be covered by the same DPIA. This makes practical sense as there is no need to re-invent the wheel each time.

For example, if you regularly carry out similar direct marketing campaigns involving profiling, one DPIA could address the risks for all such campaigns. Or, if several organisations are using similar technology, one DPIA could assist all of them, or they may be able to draw on an assessment undertaken by the technology provider. The DPIA Guidelines give a couple of examples:

  • a group of municipal authorities each setting up a similar CCTV system;
  • a railway operator with video surveillance at all its train stations.

Questions posed at project initiation can seek to identify such similar activities within or outside of the organisation. Each controller has its own responsibilities so should, of course, assess whether any previous DPIA is sufficient to cover their specific needs and risks.

Existing processing operations as at 25 May 2018

The GDPR requires a DPIA to be carried out prior to the relevant processing operations, for projects initiated after the GDPR applies from 25 May 2018. However, the DPIA Guidelines strongly recommend DPIAs for processing operations already underway at that date. If you already have a DPIA process up and running, it would in any case be wise to meet GDPR standards, unless the relevant project is very short-term.

In addition, where there is a significant change to an existing activity (eg changes to technology or the purposes of data use), this may in itself require a DPIA.

The DPIA Guidelines recommend that, in any case, existing projects are reviewed within three years of May 2018, consistent with reviews of DPIAs, as discussed below.

What methodology should be used for a DPIA?

The DPIA Guidelines confirm there is no fixed methodology for conducting a DPIA, and that organisations have flexibility to determine the precise structure and form to fit with existing working practices. However, there are minimum features which a DPIA should include, as defined by Article 35(7) (a simple illustrative record structure follows the list below):

  • a description of the envisaged processing operations and the purposes of the processing;
  • an assessment of the necessity and proportionality of the processing;
  • an assessment of the risks to the rights and freedoms of data subjects; and
  • the measures envisaged to address the risks and demonstrate compliance with the GDPR.
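
As a rough illustration of how these minimum features might be captured in a project record, the sketch below defines a simple structure mirroring Article 35(7), with optional fields for the DPO's advice and data subjects' views, which are discussed later in this article. The field names and structure are assumptions made for the example; the GDPR prescribes the content of a DPIA, not its format.

```python
# Illustrative record of the Article 35(7) minimum DPIA content.
# Field names are assumptions; the GDPR prescribes content, not any
# particular format or data structure.
from dataclasses import dataclass
from typing import List

@dataclass
class DPIARecord:
    processing_description: str         # envisaged operations and purposes
    necessity_and_proportionality: str  # assessment of necessity/proportionality
    risks_to_data_subjects: List[str]   # risks to rights and freedoms
    mitigating_measures: List[str]      # measures to address risks and
                                        # demonstrate GDPR compliance
    dpo_advice: str = ""                # advice of the DPO, where appointed
    data_subject_views: str = ""        # views sought under Article 35(9),
                                        # or reasons why not sought
    review_date: str = ""               # planned review (see ongoing reviews)

record = DPIARecord(
    processing_description="CCTV monitoring of store entrances for security",
    necessity_and_proportionality="Less intrusive measures considered and rejected",
    risks_to_data_subjects=["intrusion into the private life of passers-by"],
    mitigating_measures=["signage", "short retention period", "restricted access"],
)
```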

Those already following the ICO's Code of Practice on Privacy Impact Assessments can take comfort that it is listed within Annex 1 of the DPIA Guidelines as a potential framework for DPIAs. The Guidelines also encourage development of sector-specific frameworks.

Annex 2 of the DPIA Guidelines contains a list of criteria to assess whether or not a particular DPIA or DPIA methodology is sufficiently comprehensive to comply with the GDPR, including the minimum features set out above.

What are the risks which need to be assessed?

The risk assessment should focus on risks to individuals. Of course, these risks may lead to associated risks for the organisation, such as non-compliance, financial penalties and legal action. The DPIA Guidelines state that risks primarily relate to rights of privacy, but may also involve other fundamental rights such as freedoms of speech, thought and movement, prohibition of discrimination, and right to liberty, conscience and religion.

The level of risk should take into account both the severity of the impact, and the likelihood of such impact occurring. So, for example, the consequences of a loss of sensitive data may be more severe than with a loss of other data, and a system subject to minimal access controls may lead to a higher likelihood of loss than a system with substantial access controls. Taking both these factors into account, an overall level of risk may be determined.
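
One simple way to combine the two factors is a risk matrix. The sketch below is a generic illustration assuming three-point scales for severity and likelihood; the scales, thresholds and resulting classifications are assumptions for the example, not a methodology set out in the DPIA Guidelines.

```python
# Generic risk-matrix sketch combining severity of impact and likelihood
# of occurrence on three-point scales. The scales and thresholds are
# illustrative assumptions, not taken from the DPIA Guidelines.

SCALE = {"low": 1, "medium": 2, "high": 3}

def overall_risk(severity, likelihood):
    """Combine severity of impact and likelihood of it occurring."""
    score = SCALE[severity] * SCALE[likelihood]
    if score >= 6:
        return "high"       # e.g. sensitive data on a poorly controlled system
    if score >= 3:
        return "medium"
    return "low"

# Sensitive data (high severity) with minimal access controls (high likelihood)
print(overall_risk("high", "high"))    # -> "high"
# Non-sensitive data (low severity) with substantial controls (low likelihood)
print(overall_risk("low", "low"))      # -> "low"
```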

Consultation with data subjects

Article 35(9) of the GDPR requires consultation with data subjects or their representatives where appropriate.  Unfortunately, we still have little guidance on the interpretation of ‘where appropriate’, but the DPIA Guidelines do consider how individuals' views could be sought. They suggest internal or external studies, or formal questions or surveys sent to staff, customers or trade unions. Reasons should be documented if the final decision differs from the views of data subjects, or if the views of data subjects are not sought.

Consultation with the ICO

Fortunately, the DPIA Guidelines provide some clarity on when consultation with the ICO is required. The good news from the perspective of project costs and timetables (and maybe also the ICO's costs and timetables) is that the requirement is not as wide as the potential interpretation I outlined in my previous article. The DPIA Guidelines indicate that consultation is needed where there is a high ‘residual’ risk; in other words a high risk that has not been appropriately mitigated during the DPIA. This may arise, for example, where the only identified solutions would compromise the aims of the project.

UK law may also require consultation with the ICO in relation to a task carried out in the public interest, including processing in relation to social protection and public health.

Ongoing reviews

The DPIA Guidelines highlight that carrying out a DPIA is a continual process, not a one-time exercise. The DPIA should be updated throughout the design and implementation of the project, and then reviewed during the lifecycle of the project.

Article 35(11) of the GDPR, in particular, requires a review to assess if processing is performed in accordance with the DPIA, at least when there is a change in the risk involved. Risk profiles may be impacted, for example, by changes to the project, such as the extent of the data used, or the purposes of use; or by external factors, such as expectations or concerns of individuals, advances in technology, or legal decisions and guidance. Examples within the DPIA Guidelines include where the effects of certain automated decisions have become more significant, new categories of individuals become vulnerable to discrimination, or the data is intended to be transferred to a country which has left the EU. On this latter point, Brexit may affect the risks for many activities.

The DPIA Guidelines suggest that DPIAs should be re-assessed after three years, perhaps sooner, depending on the nature of the processing, the rate of change in the processing operation and the general circumstances.

Roles and responsibilities

It is the controller's responsibility to carry out a DPIA, though data processors involved with the activities should assist in accordance with their contract with the controller (under Article 28(3) of the GDPR).

The data protection officer (DPO) (if one is appointed) must provide advice and monitor the performance of the DPIA (under Articles 35(2) and 39(1)), and should act as the contact point for consultation with the ICO (under Article 39(1)).

The DPO Guidelines recommend that advice should be sought on whether or not to carry out a DPIA, what methodology to follow, whether to carry it out in-house or whether to outsource it, what safeguards to apply to mitigate the risks, whether or not the DPIA has been correctly carried out, and whether its conclusions comply with the GDPR. These tasks should be clearly outlined, both in the DPO's contract and within information provided to employees, management and other stakeholders.

The DPO's advice, and the decisions taken, should be documented, and any departure from the DPO's advice should be justified.

The DPIA Guidelines also suggest that other specific responsibilities should be defined, including those of specific business units, independent experts, the Chief Information Security Officer and/or the IT department.

Publication of a DPIA

Whilst publication of a DPIA is not a legal requirement, the DPIA Guidelines suggest that publication of at least a summary should be considered in order to help foster trust in the processing operations, and to demonstrate accountability and transparency. This may be particularly good practice for public authorities and where members of the public are affected.

Additional guidance

We are still awaiting other resources which may assist in implementing DPIA requirements. These include the ICO's lists of processing activities which will require a DPIA (required under Article 35(4)) or which will not require a DPIA (optional under Article 35(5)). The DPIA Guidelines are intended to inform the preparation of these lists. We also do not yet have clarity on codes of conduct which, if complied with, may be taken into account in assessing data protection impacts (under Article 35(8)). There may also be updates to the DPIA Guidelines following the (now expired) consultation period.

What to do now

Just as violent rejections of broccoli and the disappearance of yummy items from my plate are now integrated into my mealtime procedures, tailored processes for identifying whether a DPIA is needed, assessing data protection risks, and following up with reviews can be integrated into project management procedures. For those new to DPIAs, it is time to start testing out methodologies which will work for you, taking into account the high risk criteria and risk assessment guidance. Organisations which already have PIA or DPIA processes should ensure their assessments meet GDPR standards, and consider how they will build in additional procedural steps, such as ICO and DPO involvement. DPIA practices will then be ready for action by 25 May 2018.

Olivia Whitcroft is the principal of OBEP (www.obep.uk), a law firm specialising in technology contracts, data protection and intellectual property. Contact: [email protected], @ObepOlivia

Computers & Law continues to seek articles on GDPR and we are always looking for new authors.  See https://www.scl.org/about/contributing for more information.
