Interpretable artificial intelligence in radiology and radiation oncology

Cited by: 5
Authors
Cui, Sunan [1 ]
Traverso, Alberto [2 ]
Niraula, Dipesh [3 ]
Zou, Jiaren [4 ]
Luo, Yi [3 ]
Owen, Dawn [5 ]
El Naqa, Issam [3 ]
Wei, Lise [4 ]
Affiliations
[1] Univ Washington, Dept Radiat Oncol, Seattle, WA USA
[2] Dept Radiotherapy, Maastro Clin, Maastricht, Netherlands
[3] H Lee Moffitt Canc Ctr & Res Inst, Dept Machine Learning, Tampa, FL USA
[4] Univ Michigan, Dept Radiat Oncol, Ann Arbor, MI 48109 USA
[5] Mayo Clin, Dept Radiat Oncol, Rochester, MN USA
Keywords
PREDICTION; CARE
DOI
10.1259/bjr.20230142
Chinese Library Classification (CLC)
R8 [Special Medicine]; R445 [Diagnostic Imaging]
Discipline Classification Codes
1002; 100207; 1009
Abstract
Artificial intelligence has been introduced into clinical practice, especially in radiology and radiation oncology, spanning image segmentation, diagnosis, treatment planning, and prognosis. It is crucial not only to have an accurate artificial intelligence model but also to understand its internal logic and gain the trust of clinical experts. This review provides insights into the core concepts of interpretability, state-of-the-art methods for understanding machine learning models, the evaluation of these methods, their challenges and limitations, and examples of medical applications.
Pages: 10