Project Aims

  • We will identify various types of meaningful explanations for algorithmic decisions in relation to their purposes, and categorise them against the legal requirements applicable to UK businesses in the areas of data protection, discrimination and financial services.
  • We will conceive explanation-generating algorithms that process, summarise and abstract provenance logged by automated decision-making pipelines.
  • We will create an Explanation Assistant tool for data controllers to provision their applications with provenance-based explanation capabilities.
  • Throughout the project, we will engage with partners, data subjects, data controllers, and regulators via interviews and user studies to ensure the explanations are fit for purpose and meaningful.

Research Hypotheses

  1. There is a variety of explanations with different purposes that can address the requirements of diverse legal and governance frameworks (depending, in particular, on the sector concerned).
  2. A useful set of explanations can be adequately constructed by computational means.
  3. The provenance logged by automated decision pipelines according to well-defined rules provides a good input for novel algorithms that generate these explanations.
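
For concreteness, hypothesis 3 presupposes that decision pipelines emit provenance in a structured, machine-readable form. The sketch below is illustrative only: it models a provenance log as plain Python records shaped after W3C PROV relations (used, wasGeneratedBy, wasDerivedFrom), and every identifier in it (application-42, scoring-run-7, and so on) is hypothetical.

    # Hypothetical provenance log for one automated loan decision,
    # modelled loosely on W3C PROV. All identifiers are illustrative.
    provenance_log = [
        # Entities: the data artefacts handled by the pipeline.
        {"type": "entity", "id": "application-42", "label": "loan application"},
        {"type": "entity", "id": "credit-score-42", "label": "credit score"},
        {"type": "entity", "id": "decision-42", "label": "loan decision: refused"},
        # Activities: the processing steps that were executed.
        {"type": "activity", "id": "scoring-run-7", "label": "credit scoring"},
        {"type": "activity", "id": "decision-run-7", "label": "rule evaluation"},
        # Relations: how steps consumed and produced artefacts.
        {"type": "used", "activity": "scoring-run-7", "entity": "application-42"},
        {"type": "wasGeneratedBy", "entity": "credit-score-42", "activity": "scoring-run-7"},
        {"type": "used", "activity": "decision-run-7", "entity": "credit-score-42"},
        {"type": "wasGeneratedBy", "entity": "decision-42", "activity": "decision-run-7"},
        {"type": "wasDerivedFrom", "generated": "credit-score-42", "source": "application-42"},
        {"type": "wasDerivedFrom", "generated": "decision-42", "source": "credit-score-42"},
    ]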

Objectives

O1. To survey and classify the types of meaningful explanations, their purposes, and the governance frameworks to which they belong.

O2. To identify categories of computationally tractable explanations, contextualised in legal requirements, and to conceive novel algorithms that process the provenance logged by automated decision-making pipelines to generate explanations, combining explainable-computing techniques with original provenance-processing techniques.
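
As one illustration of what such an algorithm might do, the sketch below walks the wasDerivedFrom relations of a provenance log backwards from a decision and renders the derivation chain as a minimal textual explanation. It is a deliberate simplification, not the project's algorithm; it consumes the hypothetical log sketched under the hypotheses and assumes each entity has at most one derivation source.

    def derivation_chain(log, entity_id):
        """Follow wasDerivedFrom edges backwards from entity_id to its sources."""
        # Assumes a single derivation source per entity, for brevity.
        edges = {r["generated"]: r["source"]
                 for r in log if r["type"] == "wasDerivedFrom"}
        labels = {r["id"]: r["label"] for r in log if r["type"] == "entity"}
        chain = [entity_id]
        while chain[-1] in edges:
            chain.append(edges[chain[-1]])
        return [labels.get(e, e) for e in chain]

    def explain(log, decision_id):
        """Summarise the derivation chain as a plain-language explanation."""
        chain = derivation_chain(log, decision_id)
        return "The outcome '%s' was derived from: %s." % (
            chain[0], ", then ".join(chain[1:]))

    # With the hypothetical provenance_log above:
    # explain(provenance_log, "decision-42")
    # -> "The outcome 'loan decision: refused' was derived from:
    #     credit score, then loan application."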

O3. To relate traceability and explainability by specifying the rules by which automated decision-making pipelines must log provenance (including its structure and granularity) to support specific types of explanations.
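
A logging rule of the kind this objective envisages might pair an explanation type with the record kinds and the granularity the pipeline is required to emit. The mapping below is a hypothetical illustration of such a specification (the rule names and fields are ours, not a committed design); actual rules would follow from the legal analysis in O1.

    # Hypothetical logging rules: which provenance record kinds a pipeline
    # must emit, and at what granularity, to support an explanation type.
    LOGGING_RULES = {
        "data-protection-rationale": {
            "required_records": ["entity", "activity", "used",
                                 "wasGeneratedBy", "wasDerivedFrom"],
            "granularity": "per-decision",      # one trace per individual decision
        },
        "audit-trail": {
            "required_records": ["activity", "wasGeneratedBy"],
            "granularity": "per-pipeline-run",  # coarser, run-level records
        },
    }

    def validate_log(log, explanation_type):
        """Check a log contains every record kind the chosen rule requires."""
        required = set(LOGGING_RULES[explanation_type]["required_records"])
        return required <= {r["type"] for r in log}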

O4. To demonstrate, by means of user studies, that explanations conceived by our interdisciplinary approach are fit for purpose, adequate and meaningful.

O5. To architect and build the Explanation Assistant, a tool that project partners and, more broadly, any organisation controlling data can use to provide their consumers or users with meaningful explanations of automated decisions.
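
One possible shape for such a tool, purely as an architectural sketch: a thin facade that a data controller embeds in an application, wiring together the logging-rule validation and explanation routines sketched above. The class and method names here are hypothetical, not a committed interface.

    class ExplanationAssistant:
        """Hypothetical facade a data controller could embed in an application."""

        def __init__(self, provenance_store):
            # provenance_store: any callable returning the provenance log
            # recorded for a given decision identifier.
            self.store = provenance_store

        def explain_decision(self, decision_id, explanation_type):
            log = self.store(decision_id)
            # Refuse to explain if logging was too coarse for this purpose
            # (validate_log and explain are sketched under O3 and O2 above).
            if not validate_log(log, explanation_type):
                raise ValueError("provenance too coarse for " + explanation_type)
            return explain(log, decision_id)

    # Usage with the hypothetical log from the hypotheses section:
    # assistant = ExplanationAssistant(lambda _id: provenance_log)
    # assistant.explain_decision("decision-42", "data-protection-rationale")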