CHIST-ERA Call 2019 (step 2) - 10th Call for Projects of the ERA-NET CHIST-ERA (step 2)

ArgumeNtaTIon-Driven explainable artificial intelligence fOr digiTal mEdicine – ANTIDOTE

Submission summary

Providing high-quality explanations for AI predictions based on machine learning is a challenging and complex task. To work well, it requires, among other factors: selecting a proper level of generality/specificity of the explanation; considering assumptions about the familiarity of the explanation's beneficiary with the AI task under consideration; referring to the specific elements that contributed to the decision; making use of additional knowledge (e.g., metadata) which might not be part of the prediction process; selecting appropriate examples; and providing evidence supporting negative hypotheses. Finally, the system needs to formulate the explanation in a clearly interpretable, and possibly convincing, way.
Given these considerations, ANTIDOTE fosters an integrated vision of explainable AI, where low-level characteristics of the deep learning process are combined with higher-level schemas characteristic of human argumentation. This integrated vision is supported by three considerations: (i) in neural architectures, the correlation between the internal states of the network (e.g., the weights assumed by individual nodes) and the justification of the network's classification outcome is not well studied; (ii) high-quality explanations are crucially based on argumentation mechanisms (e.g., providing supporting examples and rejected alternatives) which are, to a large extent, task independent; (iii) in real settings, providing explanations is inherently an interactive process, in which an explanatory dialogue takes place between the system and the user. Accordingly, ANTIDOTE will exploit cross-disciplinary competences in three areas, i.e., deep learning, argumentation, and interactivity, to support a broader and innovative view of explainable AI. Although we envision a general integrated approach to explainable AI, we will focus on a number of deep learning tasks in the medical domain, where the need for high-quality explanations for the deliberation of clinical cases is critical.

Project coordination

Elena Cabrio (Université Côte d'Azur - Laboratoire informatique, signaux systèmes de Sophia Antipolis)

The author of this summary is the project coordinator, who is responsible for the content of this summary. The ANR declines any responsibility for its contents.

Partners

UCA - I3S: Université Côte d'Azur - Laboratoire informatique, signaux systèmes de Sophia Antipolis
UPV/EHU: University of the Basque Country UPV/EHU
FBK: Fondazione Bruno Kessler
NOVA: Universidade Nova de Lisboa
KU Leuven: Katholieke Universiteit Leuven

ANR grant: 358,851 euros
Beginning and duration of the scientific project: March 2021 - 36 months
