Explainable artificial intelligence (XAI) in radiology and nuclear medicine: a literature review

Cited by: 25
Authors
de Vries, Bart M. [1 ]
Zwezerijnen, Gerben J. C. [1 ]
Burchell, George L. [2 ]
van Velden, Floris H. P. [3 ]
van Oordt, Catharina Willemien Menke-van der Houven [4 ]
Boellaard, Ronald [1 ]
Affiliations
[1] Vrije Univ Amsterdam, Canc Ctr Amsterdam, Dept Radiol & Nucl Med, Amsterdam UMC, Amsterdam, Netherlands
[2] Vrije Univ Amsterdam, Med Lib, Amsterdam, Netherlands
[3] Leiden Univ, Dept Radiol, Med Ctr, Leiden, Netherlands
[4] Vrije Univ Amsterdam, Canc Ctr Amsterdam, Dept Oncol, Amsterdam UMC, Amsterdam, Netherlands
Keywords
deep learning; explainable artificial intelligence; magnetic resonance (MR) imaging; computed tomography (CT) imaging; positron emission tomography (PET) imaging; NEURAL-NETWORKS; DIAGNOSIS; CLASSIFICATION; SYSTEM; MRI;
DOI
10.3389/fmed.2023.1180773
CLC classification
R5 [Internal Medicine];
Subject classification codes
1002; 100201;
Abstract
Rationale: Deep learning (DL) has demonstrated remarkable performance in diagnostic imaging across diseases and modalities and therefore has high potential as a clinical tool. However, these algorithms currently see little deployment in clinical practice, because their underlying black-box mechanism undermines transparency and trust. For successful adoption, explainable artificial intelligence (XAI) could be introduced to close the gap between medical professionals and DL algorithms. In this literature review, XAI methods available for magnetic resonance (MR), computed tomography (CT), and positron emission tomography (PET) imaging are discussed and future directions are suggested.
Methods: PubMed and the Clarivate Analytics/Web of Science Core Collection were screened. Articles were eligible for inclusion if XAI was used (and well described) to explain the behavior of a DL model applied to MR, CT, or PET imaging.
Results: A total of 75 articles were included, of which 54 described post hoc XAI methods, 17 described ad hoc XAI methods, and 4 described both. Major variations in performance are seen between the methods. Overall, post hoc XAI lacks the ability to provide class-discriminative and target-specific explanations. Ad hoc XAI appears to address this through its intrinsic ability to explain. However, quality control of XAI methods is rarely applied, which makes systematic comparison between the methods difficult.
Conclusion: There is currently no clear consensus on how XAI should be deployed to close the gap between medical professionals and DL algorithms for clinical implementation. We advocate systematic technical and clinical quality assessment of XAI methods. In addition, to ensure end-to-end unbiased and safe integration of XAI into the clinical workflow, (anatomical) data minimization and quality control methods should be included.
Pages: 14