Argumentative Explanations in AI

August 30, 1:45 pm - 3:15 pm (CEST)

Speakers: Antonio Rago & Francesca Toni

Tutorial website: https://www.doc.ic.ac.uk/~afr114/ecaitutorial/


As AI becomes ever more ubiquitous in our everyday lives, its ability to explain to and interact with humans is evolving into a critical research area. Explainable AI (XAI) has therefore emerged as a popular topic, but its research landscape is currently very fragmented. For XAI to succeed, a general-purpose, systematic approach is needed that addresses the two challenges of explainability and anthropomorphisation in concert, forming the basis of an AI-supported but human-centred society.

We will provide an extensive introduction to the ways in which argumentation can be used to address these two challenges in concert, via the extraction of bespoke, human-like explanations for AI systems. We will review recent literature in which this has been achieved in one of two ways: by building explainable systems with argumentative foundations from scratch, or by extracting argumentative reasoning from general AI systems.
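As a minimal, self-contained illustration of the kind of argumentative foundation the tutorial will cover (a sketch assuming Dung-style abstract argumentation, not code from the tutorial materials), the following Python snippet computes the grounded extension of an abstract argumentation framework; the arguments and attack relation are hypothetical.

```python
# Minimal sketch (not from the tutorial): computing the grounded extension of a
# Dung-style abstract argumentation framework, a common formal backbone for
# argumentation-based explanations. The example arguments and attacks are made up.

def grounded_extension(arguments, attacks):
    """Iterate the characteristic function F(S) = {a | every attacker of a is
    attacked by some member of S}, starting from the empty set, until a fixed point."""
    attackers = {a: {x for (x, y) in attacks if y == a} for a in arguments}
    extension = set()
    while True:
        # An argument is acceptable w.r.t. `extension` if each of its attackers
        # is itself attacked by some argument already in `extension`.
        acceptable = {
            a for a in arguments
            if all(any((d, b) in attacks for d in extension) for b in attackers[a])
        }
        if acceptable == extension:
            return extension
        extension = acceptable

if __name__ == "__main__":
    # Hypothetical framework: a attacks b, b attacks c.
    args = {"a", "b", "c"}
    atts = {("a", "b"), ("b", "c")}
    print(grounded_extension(args, atts))  # -> {'a', 'c'}
```

In an explanation setting, the accepted arguments (here {a, c}) and the attacks they survive can be read off as a dialectical justification for a system's output; the tutorial surveys concrete instances of this idea.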

We will motivate and explain a topic of emerging importance, namely Argumentative Explanations in AI, which is itself a novel synthesis of two distinct lines of AI work. Argumentation is an established branch of knowledge representation that has historically been well represented in the ECAI community, while XAI is arguably one of the biggest current concerns for the AI community in general.

This tutorial is aimed at any AI researcher interested in how argumentation can contribute to the timely field of XAI. It will be self-contained, with basic background on argumentation and XAI provided.

Antonio Rago, Department of Computing, Imperial College London

Francesca Toni, Department of Computing, Imperial College London