BrAInVision: A hybrid explainable Artificial Intelligence framework for brain MRI analysis

Cited by: 0
Authors
Gagliardi, Marco [1 ,2 ]
Maurmo, Danilo [1 ,2 ]
Ruga, Tommaso [1 ,2 ]
Vocaturo, Eugenio [1 ,2 ]
Zumpano, Ester [1 ,2 ]
Affiliations
[1] Univ Calabria, DIMES, Arcavacata Di Rende, CS, Italy
[2] CNR NANOTEC Natl Res Council, Arcavacata Di Rende, CS, Italy
Keywords
Brain tumor; Machine learning; Hybrid features; Hand-crafted features; Explainable AI; Entropy
DOI
10.1016/j.imavis.2025.105629
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Brain tumors pose a significant medical challenge, characterized by high incidence and mortality rates, which underscore the critical need for accurate and early diagnosis using minimally invasive techniques such as magnetic resonance imaging. In this context, Artificial Intelligence (AI) has emerged as a promising tool to enhance diagnostic precision and efficiency. However, its widespread adoption in clinical practice remains limited due to the opacity of AI-driven decision-making processes. To address this challenge, we introduce BrAInVision, a hybrid and doubly explainable AI framework for brain tumor detection. The novelty of our approach is the integration of both deep learning and traditional machine learning techniques, combining deep extracted features with hand-crafted features to create a more robust and interpretable classification system. In contrast to conventional single-explanation methods, our framework provides comprehensive explainability through a multi-level analytical approach, enhancing both interpretability and transparency. The first level employs Grad-CAM to visualize regions of interest identified by the deep feature extractor, while the second level utilizes Permutation Feature Importance and Partial Dependence Plots to understand and quantify the contribution of specific image characteristics to diagnostic decisions. The proposed framework achieved an F1-score of 97% on the four-class task (Glioma/Meningioma/Pituitary/NoTumor) and an average F1-score of 99% in binary classification (Glioma/NoTumor), outperforming current state-of-the-art methods. The proposed approach has been validated on both the original dataset and an independent dataset with radiologist-annotated tumor masks, demonstrating strong generalizability. Designed for seamless integration into radiologists' workflows as a decision support system, BrAInVision ensures a high degree of explainability, thereby fostering greater trust in AI-assisted medical decision-making.
Pages: 13