
Explainable Artificial Intelligence for Diagnostic Support: an application to Distal Myopathies

Frasson, Giada
2024/2025

Abstract

Artificial Intelligence systems have the potential to revolutionize the field of medicine by increasing the efficiency of the healthcare sector and improving the quality of care. However, this transformation requires trust from medical professionals in these systems, a trust that can only be achieved through understanding. For this reason, a new research field has emerged: eXplainable Artificial Intelligence (XAI), which aims to explain the decision-making process of Artificial Intelligence algorithms. XAI is essential in high-stakes environments, such as medicine, where a wrong decision can seriously affect human lives. This study analyses, using various explainability methods, the decisions of a diagnostic support model for Distal Myopathies, a rare form of neuromuscular disease. It also proposes new explainability techniques: a novel approach to occlusion, called hierarchical occlusion, and the use of ensemble methods that combine individual explanations to generate more refined outputs. Finally, it evaluates the results of the explainability methods through the feedback of different expert observers and discusses their performance, limitations, and potential impact on trust and usability in clinical practice.
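Occlusion-based attribution, which the proposed hierarchical occlusion builds on, works by masking image regions one at a time and measuring how much the model's confidence drops. Since the thesis full text is not available here, the following is only a minimal sketch of standard occlusion sensitivity plus a naive ensemble of attribution maps; the `model`, patch size, baseline value, and normalise-then-average combination are illustrative assumptions, not the author's implementation:

```python
import numpy as np

def occlusion_map(model, image, patch=4, baseline=0.0):
    """Standard occlusion sensitivity: slide a patch over the image,
    replace it with a baseline value, and record how much the model's
    score drops when that region is hidden."""
    h, w = image.shape
    base_score = model(image)
    heat = np.zeros((h, w))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = baseline
            # A larger score drop means the occluded region mattered more.
            heat[i:i + patch, j:j + patch] = base_score - model(occluded)
    return heat

def ensemble_attribution(maps):
    """Naive ensemble: min-max normalise each attribution map,
    then average them into a single combined explanation."""
    norm = [(m - m.min()) / (m.max() - m.min() + 1e-8) for m in maps]
    return np.mean(norm, axis=0)

# Toy stand-in for a classifier: "confidence" is the mean intensity
# of the top-left quadrant, so only that region should light up.
model = lambda img: img[:8, :8].mean()
image = np.zeros((16, 16))
image[:8, :8] = 1.0

heat = occlusion_map(model, image)
combined = ensemble_attribution([heat, heat])
```

With the toy model above, occluding any patch inside the top-left quadrant lowers the score, so the heat map is positive only there; combining several such maps (here, trivially, two copies) illustrates how an ensemble can smooth individual explanations into one output.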

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14247/22928