Facial Recognition: the ICO Opinion

January 17, 2020

The Opinion sets out the ICO’s position and makes recommendations relating to the use of Live Facial Recognition by police forces, following an ICO investigation into how the police use this technology in public spaces. The report of the ICO’s investigation findings and the Opinion were both published on 31 October 2019. This is the first Opinion issued by the Commissioner under section 116 and Schedule 13 of the Data Protection Act 2018, and its publication shows how seriously the ICO is taking this topic.

Live facial recognition technology processes biometric data in real time, allowing police to identify individuals as they pass facial recognition cameras, although non-live methods of identification, such as matching against older or still images, are also used by the police. As it involves the processing of biometric data, live facial recognition falls within the scope of the GDPR and the DP Act 2018, with police forces having to comply with Part 3 of the DP Act 2018 (law enforcement processing). The use of live facial recognition by law enforcement constitutes sensitive processing of biometric data under section 35 of the Act and as such is subject to greater safeguards. For example, the police have to demonstrate that the processing is strictly necessary, which is a higher bar than merely necessary, and further that a Schedule 8 DP Act 2018 condition is met. Police forces’ use of this technology has been increasing in recent times, having been deployed at events where large crowds are expected, such as football stadiums and the Notting Hill Carnival.

Whilst this Opinion, the investigation and its recommendations focus on law enforcement, many of the privacy issues that this developing technology raises will come into play when the use of this technology is being considered by the private sector. It is clear that there is a growing interest from private sector organisations in facial recognition type technology, from shopping centres wanting to identify known shoplifters or individuals with retail exclusion orders to bars and nightclubs wanting to identify persons of interest, and it is easy to see why its use may be attractive to some businesses. It is also clear that this technology is a regulatory priority for the ICO, who have indicated that they are also investigating its use outside of the policing sphere.

ICO investigation

The ICO report and Opinion follow a 17-month investigation into the use of live facial recognition by primarily South Wales Police and the Met Police Service. Aside from the ICO investigation, Facial Recognition Technology (FRT) has also featured regularly in the media over the past few months: from Kings Cross Estate using FRT for almost two years without the public being aware, to debate around the increasing use of FRT in shops and supermarkets.

During the course of the ICO’s investigation, the use of FRT by South Wales Police led to the case of R (Bridges) v Chief Constable of South Wales Police and Others [2019] EWHC 2341 (Admin). South Wales Police trialled live facial recognition technology in public spaces, in order to identify individuals who may be connected to criminal activity or at risk in some way, by scanning profiles and comparing against offender databases. Similar trials have been undertaken by other police forces, including the Met.

The technology processes the biometric data of potentially thousands of individuals who pass before the cameras. Some individuals will be stopped and spoken to by officers as a result, sometimes without justification, as the accuracy and effectiveness of the technology is questionable, particularly in relation to some ethnic groups. The action of South Wales Police was challenged by way of Judicial Review in the case of Bridges by a member of the public concerned about the lawfulness of the way his data had been processed whilst out shopping in Cardiff City Centre. The claimant was supported in his legal action by the civil liberties group Liberty.

The ICO has raised concerns over the invasiveness of the technology for some time. The Commissioner has blogged about her concerns relating to the unnecessary intrusion and potential detriment that could be caused by the use of the technology, and the ICO intervened in the Bridges case as an interested party, making submissions to the Court. The High Court in Bridges did not consider that the processing was unlawful, rejecting the claimant’s arguments that the processing carried out by South Wales Police was not in accordance with the law or proportionate. However, the claimant has publicly confirmed that he has commenced an appeal against the decision. South Wales Police have subsequently continued to use the technology, although its recent use at a football match between Swansea City and Cardiff City was met with opposition from some football supporters, who turned up wearing masks and carrying banners in protest.

Support from the general public

Whilst the use of this technology is controversial, interestingly, facial recognition has relatively strong support from the general public. The ICO’s investigation report revealed strong public support for the use of live facial recognition for law enforcement purposes, with 82% of those asked (from a sample of over 2,200) taking the view that it was acceptable for the police to use the technology. The results remained strong where it was suggested that only one person was being sought: 60% of those asked agreed that it would be acceptable to process the faces of a crowd even to locate only one person of interest, and a similarly large number (58%) thought it would be acceptable to be stopped by the police if erroneously matched. Based on these results, the general public appears supportive of the use of the technology in policing, although the ICO investigation report acknowledges other research showing that the picture is not consistent across society, with lower support amongst certain groups. It would be interesting to know how public opinion would compare if asked about use of the technology in the private sector.

ICO proposes a statutory code

It could be said that we are sleepwalking even further into a surveillance society, echoing the concerns raised by the ICO in relation to the increasing use of CCTV technology over a decade ago. With the lawfulness of South Wales Police’s use of FRT having been confirmed by the High Court, subject to any appeal, the technology is likely to continue to be rolled out and increase in prevalence. One of the main recommendations in the Opinion is the Commissioner’s call for the strengthening of the legal framework in this area by the introduction of a statutory Code of Practice, to bring greater clarity, foreseeability and consistency to the use of the technology. The ICO recommends that the development of this Code be led by the Government, citing the Surveillance Camera Code as an example. A statutory code would no doubt be welcomed by many, as it would provide a clear framework for data controllers on how to carry out this processing in a way that is justifiable and proportionate, especially as this is one of the areas where the legal framework and guidance are struggling to keep pace with the speed of technological advancement. Further guidance on the use of FRT in the private sector would also be particularly helpful, as there is currently nothing specific available in this area.

Whilst the court in Bridges found the use of FRT to be lawful in that specific context, there is a strong public interest in the use of FRT to prevent and detect crime, whereas some uses in the private sector will not necessarily justify the same level of intrusion. The principles of necessity and proportionality are, therefore, key. In the Opinion, the Commissioner takes the view that, whilst the High Court in Bridges found the live facial recognition undertaken by South Wales Police to be lawful, this should not be seen as a blanket authorisation to use the technology in all circumstances. Data controllers considering facial recognition will also need to consider the lawful basis for processing carefully and determine it before commencing processing. If consent is being considered as a potential basis, any power imbalance will, of course, be relevant – the ICO takes the view that it is highly unlikely that consent would be a valid basis in the context of law enforcement.

Appropriate policy document and DPIA needed

Organisations considering the introduction of this technology are advised to develop an appropriate policy document, setting out the justifications for use of the technology, in order to be able to demonstrate that its use is necessary and proportionate. In the investigation report the ICO indicates that further guidance on appropriate policy documents is in the process of being developed.

Organisations are further advised to carry out a thorough, well documented data protection impact assessment (DPIA) to assess the impact that the processing will have on individuals and how the resulting risks will be specifically addressed. The ICO’s investigation report sets out a number of areas where the police DPIAs reviewed as part of the investigation could be improved, including more detailed consideration of matters such as strict necessity and proportionality. One suggested area of improvement was greater involvement of the DPO, particularly in the earlier stages of the process. The ICO recommends that law enforcement agencies provide their DPIAs to the regulator in advance of roll-out, so that early engagement can take place.

A further ICO recommendation relates to the development of the technology’s algorithms: they should not contain any technical bias that treats certain groups less favourably and, to the extent that any such bias may be present, steps should be taken to mitigate it. The ICO notes that a failure to address such bias may have implications not only under the DP Act 2018 but also, for public bodies, under the Equality Act 2010. It will also be important to get fair processing information, including signage, right. Signage needs to be clear, individuals must be sufficiently informed of the processing being undertaken, and data subjects should be made aware of how they can exercise their rights under the DP Act 2018 in relation to the processing.

With FRT such a contentious issue, this will be an area the ICO is expected to keep a close eye on, and it may only be a matter of time before formal enforcement action is taken. Certainly, given the UK Commissioner’s warnings about how concerned she is on this issue, any data controller planning to use FRT who is lax about compliance with the DP Act 2018 and the GDPR ought not to be surprised to find themselves subject to an ICO investigation.

The ICO won’t be the first to enforce on FRT, however. In August 2019 the Swedish data protection authority issued a GDPR penalty on this matter. The case concerned a school that was piloting the use of FRT to monitor students’ attendance and save teacher time in taking the register. The failings in this case related to processing data in a more invasive manner than necessary (Article 5), processing sensitive personal data without a legal basis, consent not being valid (Article 9), and not complying with the DPIA and prior-consultation requirements (Articles 35 and 36).

Use of facial recognition programmes falls under DP law

In its Opinion, issued on 31 October, the ICO says that data protection law applies to the whole process of live facial recognition (LFR): from consideration of the necessity and proportionality of deployment, through the compilation of watchlists and the processing of the biometric data, to the retention and deletion of that data.

‘Controllers must identify a lawful basis for the use of LFR. This should be identified and appropriately applied in conjunction with other available legislative instruments such as codes of practice,’ the ICO says.

Based on the judgment in Bridges and the evidence gathered in the ICO investigation, the ICO says that there is no basis for regulatory action.

While there is some evidence of processing good practice by both South Wales Police (SWP) and the Metropolitan Police Service (MPS), there are areas of data protection compliance where the MPS and SWP could improve practices, share lessons and reduce inconsistency.

As there is an increased risk of compliance failure and undermining public confidence, the ICO says forces and other law enforcement agencies are advised to consider the points made in the Commissioner’s opinion.


Aaminah Khan, Barrister, St John’s Buildings