Physiological signal analysis using explainable artificial intelligence: A systematic review

Cited by: 1
Authors
Shen, Jian [1 ]
Wu, Jinwen [1 ]
Liang, Huajian [1 ]
Zhao, Zeguang [1 ]
Li, Kunlin [2 ]
Zhu, Kexin [1 ]
Wang, Kang [1 ]
Ma, Yu [1 ]
Hu, Wenbo [1 ]
Guo, Chenxu [1 ]
Zhang, Yanan [1 ]
Hu, Bin [1 ]
Affiliations
[1] Beijing Inst Technol, Sch Med Technol, Beijing 100081, Peoples R China
[2] Hebei Univ, Sch Elect Informat Engn, Baoding 071000, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
Physiological signals; Artificial intelligence; Interpretable modeling; Medical and health; COMPUTER-AIDED DETECTION; EMOTION RECOGNITION; CONVOLUTIONAL TRANSFORMER; DEPRESSION RECOGNITION; ECG SIGNALS; EEG; SLEEP; DATABASE; PREDICTION; ASYMMETRY;
DOI
10.1016/j.neucom.2024.128920
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the continuous development of wearable sensors, it has become increasingly convenient to collect various physiological signals from the human body. Combining Artificial Intelligence (AI) technology with these physiological signals has significantly improved people's awareness of their psychological and physiological states, driving substantial progress in the medical and health industries. However, most current research on physiological signal modeling does not consider interpretability, which poses a significant challenge for clinical diagnosis and treatment support. Interpretability refers to explaining the internal workings of an AI model as it generates its decisions, and it is regarded as an essential foundation for understanding how a model operates. Despite substantial progress in this field in recent years, there is still no systematic discussion of interpretable AI for physiological signal modeling, making it difficult for researchers to comprehensively grasp the latest developments and emerging trends. Therefore, this paper provides a systematic review of interpretable AI technologies in the domain of physiological signals. Based on the scope of interpretability, we divide these technologies into two categories, global and local interpretability, and conduct an in-depth analysis and comparison of the two. Subsequently, we explore potential applications of interpretable physiological signal modeling in areas such as medicine and healthcare. Finally, we summarize the key challenges of interpretable AI in the context of physiological signals and discuss future research directions. This study aims to provide researchers with a systematic framework for understanding and applying interpretable AI technologies and to lay a foundation for future research.
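The local/global distinction drawn in the abstract can be illustrated with a minimal local-explanation sketch. Everything below is an illustrative assumption rather than material from the reviewed paper: a toy scoring model, a synthetic 1-D signal, and a fixed occlusion window. Occlusion saliency, one common local interpretability technique, attributes a model's output to segments of the input signal by zeroing out each segment and measuring the resulting score drop.

```python
# Occlusion-based local explanation for a 1-D signal (hypothetical sketch).
# The model, signal, and window size are illustrative assumptions.

def model_score(signal):
    """Toy 'classifier': responds to energy in samples 20-40,
    standing in for a hypothetical diagnostic waveform segment."""
    return sum(x * x for x in signal[20:40])

def occlusion_saliency(signal, score_fn, window=10):
    """Local explanation: importance of each window equals the score
    drop observed when that window is zeroed out (occluded)."""
    base = score_fn(signal)
    importances = []
    for start in range(0, len(signal), window):
        occluded = list(signal)
        for i in range(start, min(start + window, len(signal))):
            occluded[i] = 0.0
        importances.append(base - score_fn(occluded))
    return importances

# Synthetic signal: a burst between samples 20 and 40, low level elsewhere.
sig = [0.1] * 60
for i in range(20, 40):
    sig[i] = 1.0

imp = occlusion_saliency(sig, model_score)
print(imp)  # windows covering samples 20-39 dominate the explanation
```

A global method, by contrast, would characterize the model's behavior over the whole input distribution (e.g., via surrogate models or learned rules) rather than per-sample attributions like these.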
Pages: 23
References (200 in total)
[51]   EEG emotion recognition using attention-based convolutional transformer neural network [J].
Gong, Linlin ;
Li, Mingyang ;
Zhang, Tao ;
Chen, Wanzhong .
BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 84
[52]   XAI for myo-controlled prosthesis: Explaining EMG data for hand gesture classification [J].
Gozzi, Noemi ;
Malandri, Lorenzo ;
Mercorio, Fabio ;
Pedrocchi, Alessandra .
KNOWLEDGE-BASED SYSTEMS, 2022, 240
[53]   Toward Deep Generalization of Peripheral EMG-Based Human-Robot Interfacing: A Hybrid Explainable Solution for NeuroRobotic Systems [J].
Gulati, Paras ;
Hu, Qin ;
Atashzar, S. Farokh .
IEEE ROBOTICS AND AUTOMATION LETTERS, 2021, 6 (02) :2650-2657
[54]   Automated Detection and Localization of Myocardial Infarction With Interpretability Analysis Based on Deep Learning [J].
Han, Chuang ;
Sun, Jiajia ;
Bian, Yingnan ;
Que, Wenge ;
Shi, Li .
IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2023, 72
[55]   Deep learning models for electrocardiograms are susceptible to adversarial attack [J].
Han, Xintian ;
Hu, Yuxuan ;
Foschini, Luca ;
Chinitz, Larry ;
Jankelson, Lior ;
Ranganath, Rajesh .
NATURE MEDICINE, 2020, 26 (03) :360-+
[56]  
Hasan M., 2021, INT C INF EL VIS, P1
[57]  
Hastie, Trevor, 2009, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd ed.
[58]   Multimodal Multitask Neural Network for Motor Imagery Classification With EEG and fNIRS Signals [J].
He, Qun ;
Feng, Lufeng ;
Jiang, Guoqian ;
Xie, Ping .
IEEE SENSORS JOURNAL, 2022, 22 (21) :20695-20706
[59]   Explaining deep neural networks for knowledge discovery in electrocardiogram analysis [J].
Hicks, Steven A. ;
Isaksen, Jonas L. ;
Thambawita, Vajira ;
Ghouse, Jonas ;
Ahlberg, Gustav ;
Linneberg, Allan ;
Grarup, Niels ;
Strumke, Inga ;
Ellervik, Christina ;
Olesen, Morten Salling ;
Hansen, Torben ;
Graff, Claus ;
Holstein-Rathlou, Niels-Henrik ;
Halvorsen, Pal ;
Maleckar, Mary M. ;
Riegler, Michael A. ;
Kanters, Jorgen K. .
SCIENTIFIC REPORTS, 2021, 11 (01)
[60]  
Hong SD, 2019, arXiv preprint arXiv:1905.11333, DOI 10.48550/arXiv.1905.11333