ICO issues opinion on use of live facial recognition technology in public places

June 22, 2021

The ICO has now published an Opinion on the controversial use of live facial recognition technology in public places. Facial recognition technology (FRT) relies on the use of people's personal data and biometric data. Data protection law therefore applies to any organisation using it. Live facial recognition (LFR) is a type of FRT that often involves the automatic collection of biometric data. This means it has greater potential to be used in a privacy-intrusive way.

The ICO previously published an Opinion on the use of LFR in a law enforcement context. It concluded that data protection law sets a high bar for the use of LFR in public places to be lawful. The ICO has now assessed and investigated the use of LFR outside the law enforcement context. This has covered controllers who are using the technology for a wider range of purposes and in many different settings.

This work has informed the ICO’s view on how LFR is typically used today, the interests and objectives of controllers, the issues raised by the public and wider society, and the key data protection considerations. The ICO has therefore published the Opinion to explain how data protection law applies to what it calls a complex and novel type of data processing. It also explains why biometric data is more sensitive than other forms of personal data.

The ICO has investigated 14 specific deployments of LFR as well as wider use overseas. Controllers often use LFR for surveillance purposes, aiming to prevent crime or other unwanted behaviours in physical retail, leisure and transport settings or other public places. LFR can identify particular individuals entering the premises and allow the controller to take action (for example, removing them from the premises concerned). The ICO has also seen an increasing appetite to use LFR for marketing, targeted advertising and other commercial purposes. This can involve using an individual's biometric data to place them in a particular category.

In the longer term, the technology has the potential to be used for more advanced practices. This could include integration with big-data ecosystems which combine large datasets from multiple sources such as social media. 

The ICO has identified the following key data protection issues:

  • the governance of LFR systems, including why and how they are used;
  • the automatic collection of biometric data at speed and scale without clear justification of the necessity and proportionality of the processing;
  • a lack of choice and control for individuals;
  • transparency and data subjects’ rights;
  • the effectiveness and the statistical accuracy of LFR systems;
  • the potential for bias and discrimination;
  • the governance of watchlists and escalation processes;
  • the processing of children’s and vulnerable adults’ data; and
  • the potential for wider, unanticipated impacts for individuals and their communities.

The Opinion considers how LFR might comply with data protection law, in particular the UK GDPR and the Data Protection Act 2018.

The ICO says that it will continue to investigate and advise on the issue. This includes completing investigations already underway, assessing data protection impact assessments which identify high-risk processing, conducting a proactive audit of LFR systems in deployment, and, where appropriate, supporting data protection codes of conduct or certification schemes. The ICO has also made recommendations for technology vendors and the wider industry.

In considering any regulatory action or use of its enforcement powers, the ICO says that it may refer to the Opinion as a guide to how it interprets and applies the law. Each case will be fully assessed on the basis of its facts and the relevant law. The ICO may update or revise the Opinion based on any material legal or practical developments, such as judicial decisions and case law, or further findings from regulatory work and practical experience.