Explaining decisions made with AI: ICO and the Alan Turing Institute guidance published

The guidance consists of three parts offering practical advice and tools on how to explain decisions made with AI to the individuals affected by them

The ICO and the Alan Turing Institute have issued their joint guidance on how to explain decisions made with AI, following a consultation late last year.

Increasingly, organisations are using artificial intelligence to support, or to make, decisions about individuals. The guidance aims to give organisations practical advice on explaining the processes, services and decisions delivered or assisted by AI to the individuals affected by them.

Depending on a reader's level of expertise, and the make-up of an organisation, some parts of the guidance may be more relevant than others. It is not a statutory code of practice under the Data Protection Act 2018, but it aims to provide information that will help organisations comply with a range of legislation and demonstrate best practice.

The guidance consists of three parts:

Part 1 – The basics of explaining AI

Part one defines the key concepts and outlines a number of different types of explanation. It will be relevant to all members of staff involved in the development of AI systems. It covers definitions, the legal framework, benefits and risks, what goes into an explanation, the relevant contextual factors, and the principles to follow.

Part 2 – Explaining AI in practice

Part two aims to assist with the practicalities of explaining these decisions and providing explanations to individuals. It will be most helpful for an organisation's technical teams, although DPOs and compliance teams will also find it useful. It summarises the tasks to undertake:

  • Select priority explanations by considering the domain, use case and impact on the individual;
  • Collect and pre-process data in an explanation-aware manner;
  • Build systems to ensure the organisation is able to extract relevant information for a range of explanation types;
  • Translate the rationale of the system’s results into usable and easily understandable reasons;
  • Prepare implementers to deploy the AI system; and
  • Consider how to build and present explanations.

Part 3 – What explaining AI means for an organisation

Part three covers the roles, policies, procedures and documentation that can be put in place to ensure an organisation is set up to provide meaningful explanations to affected individuals. It is primarily targeted at an organisation’s senior management team, but DPOs and compliance teams will also find it useful.

Published: 22 May 2020