ICO publishes guidance on AI decision making

Published on 02 November 2020

How can companies comply with data regulation when using AI to make decisions affecting individuals?

The key takeaway
The ICO has issued guidance on how best to ensure your AI systems comply with the GDPR requirement that automated decisions are explainable.

The background
The ICO recently published guidance – "Explaining decisions made with AI" – to assist organisations in explaining how they use AI. The guidance is not intended to be exhaustive, nor is it binding, but it aims to be a useful tool for compliance teams, data protection officers and senior management by providing practical advice on data protection compliance.

The guidance
The guidance is split into three sections.

The first section: This describes the basics of explaining AI. The ICO identifies four principles to guide organisations on making decisions explainable:

  • be transparent
  • be accountable
  • consider the context you are operating in
  • reflect on the impact of the AI system on the individuals affected, as well as wider society.

The guidance then goes on to identify six different ways of explaining AI decisions:

  • Rational explanation – explain the reasons which led to the decision, delivered in an accessible and non-technical way
  • Responsibility explanation – describe who is involved in the decision, who is accountable, and who to contact for a human review of the decision
  • Data explanation – explain what data was used by the AI in coming to the decision; in some cases it may also be necessary to provide more details of the decision itself eg where an individual has been placed in a particular category and does not understand why
  • Fairness explanation – describe the steps taken to ensure an AI system’s decisions are fair. Be sure to include fairness considerations at all steps of the process, from the design of the AI to the selection of data used
  • Safety and performance explanation – explain the steps taken to make the AI system perform as accurately, reliably, securely and robustly as possible
  • Impact explanation – describe how the AI system monitors and accounts for all potential impacts its decisions could have.

The ICO goes on to explain the contextual factors that organisations should bear in mind when providing explanations: domain (ie setting or sector of the AI system), data, impact, urgency, and audience.

The second section: This goes through the practicalities of explaining AI decisions to individuals and is primarily aimed at the technical teams of organisations. It provides a list of tasks that, when followed, assist in creating an AI system whose decisions are more easily explainable. The ICO recommends that any approach should be informed by the importance of building the principles of transparency and accountability into AI systems.

The third section: This is aimed primarily at senior management and outlines the roles and responsibilities of those involved in the explanation process. General guidance is provided on what sorts of policies should be in place, and loosely describes what those policies might look like. For example, a data collection policy would detail the need to consider how decisions could be explained at every stage of the development of an AI system. A list of recommended documentation is provided which, if followed, will provide evidence to demonstrate the explainability of an organisation's AI systems and form an 'audit trail' of explanations provided to individuals.

Why is this important?

The explainability of AI decisions is crucial to GDPR compliance, and the guidance is essential reading for anyone engaged in developing AI systems.

Any practical tips?

  • Have your technical teams review the second section of the guidance and consider whether your current systems comply. Can they amend their processes to follow the list of suggested tasks provided by the ICO?
  • Draft (or, if already drafted, amend) the policies and documentation listed in the third section of the guidance, which describes what the policies should achieve and includes useful templates, eg for documenting processing activities.

Autumn 2020
