ICO issues report warning of dangers of discrimination in neurotechnologies

June 8, 2023

The Information Commissioner’s Office has issued a new report which warns that newly emerging neurotechnologies risk discriminating against certain groups of people if those groups are not put at the heart of their development.

The ICO predicts that the use of technology to monitor neurodata, the information coming directly from the brain and nervous system, will become widespread over the next decade.

Neurotech is already used in the healthcare sector, where there are strict regulations. It can predict, diagnose, and treat complex physical and mental illnesses, transforming a person’s responses to illnesses such as dementia and Parkinson’s disease. In May, Gert-Jan Oskam, a 40-year-old Dutch man who was paralysed in a cycling accident 12 years ago, was able to walk again due to electronic implants in his brain.

However, neurotechnologies are rapidly being developed for use in the personal wellbeing, sports and marketing sectors, and also for monitoring people in the workplace. The report says that if these are not developed and tested on a wide enough range of people, there is a risk of inherent bias and inaccurate data being embedded in neurotechnology – having a negative impact on people and communities in the UK.

Discrimination in neurotechnology could occur where models are developed that contain bias, leading to inaccurate data and assumptions about people and communities.

The risks of inaccurate data emerge when devices are not trialled and assessed on a wide variety of people to ensure that data collection remains accurate and reliable.

Neurodivergent people may be particularly at risk of discrimination from inaccurate systems and databases that have been trained on neuro-normative patterns.

The use of neurotech in the workplace could also lead to unfair treatment. For example, if specific neuropatterns or information come to be seen as undesirable due to ingrained bias, people exhibiting those patterns may be overlooked for promotions or employment opportunities.

As well as the issue of discrimination, the ICO report covers other regulatory issues such as:

  • Regulatory definitions – personally identifiable neurodata is always considered to be personal information, irrespective of purpose. However, there is no explicit definition of neurodata as either a specific form of personal information or special category data under the UK GDPR. Organisations must be clear about when complex processing involves biometric data, and the situations in which biometric data is special category data;
  • Consent, neurodata and appropriate bases of processing – when using neurodata that does not meet the threshold for special category data, organisations must still identify a lawful basis for processing personal data under Article 6 of the UK GDPR. Potentially relevant bases organisations should consider for commercial purposes are consent, legitimate interest and performance of a contract; and
  • Closed-loop processing – this is where devices use automated algorithmic processing to assess personal information in the form of electrical patterns from the brain, then take automated action unprompted by the user and without significant human intervention. It poses heightened risks around shifts in purpose and in automated processing.

The ICO is developing specific neurodata guidance in the medium term. It will consider the interpretation of core legislative and technical neurotechnology definitions, highlight links to existing ICO guidance, set out its views on emergent risks, and provide sector-specific case studies to highlight good practice by 2025.

The ICO will also address some other issues elsewhere, as it builds on its AI framework and forthcoming guidance on workplace surveillance. This will include potential neurodiscrimination arising through inaccurate information or inappropriate processing and decision-making.