A human-interpretable machine learning pipeline based on ultrasound to support leiomyosarcoma diagnosis

Cited by: 7
Authors:
Lombardi, Angela [1 ]
Arezzo, Francesca [2 ]
Di Sciascio, Eugenio [1 ]
Ardito, Carmelo [3 ]
Mongelli, Michele [4 ]
Di Lillo, Nicola [4 ]
Fascilla, Fabiana Divina [5 ]
Silvestris, Erica [2 ]
Kardhashi, Anila [2 ]
Putino, Carmela [4 ]
Cazzolla, Ambrogio [2 ]
Loizzi, Vera [2 ,6 ]
Cazzato, Gerardo [7 ]
Cormio, Gennaro [2 ,6 ]
Di Noia, Tommaso [1 ]
Affiliations:
[1] Politecn Bari, Dept Elect & Informat Engn DEI, Bari, Italy
[2] IRCCS Ist Tumori Giovanni Paolo II, Gynecol Oncol Unit, Interdisciplinar Dept Med, Bari, Italy
[3] LUM Giuseppe Degennaro Univ, Dept Engn, Casamassima, Bari, Italy
[4] Univ Bari Aldo Moro, Dept Biomed Sci & Human Oncol, Obstet & Gynecol Unit, Bari, Italy
[5] Di Venere Hosp, Obstet & Gynecol Unit, Bari, Italy
[6] Univ Bari Aldo Moro, Interdisciplinar Dept Med, Bari, Italy
[7] Univ Bari Aldo Moro, Dept Emergency & Organ Transplantat DETO, Sect Pathol, Bari, Italy
Keywords:
Human-centered AI; Machine learning; eXplainable artificial intelligence; Interpretability; Ultrasound; Leiomyosarcoma; CAD; DIFFERENTIAL-DIAGNOSIS; UTERINE SARCOMA; MORCELLATION; LEIOMYOMA; EXPLANATIONS; REGRESSION; SELECTION; OUTCOMES; IMPACT; CANCER;
DOI
10.1016/j.artmed.2023.102697
Chinese Library Classification (CLC):
TP18 [Artificial intelligence theory]
Subject classification codes:
081104; 0812; 0835; 1405
Abstract
The preoperative evaluation of myometrial tumors is essential to avoid delayed treatment and to establish the appropriate surgical approach. The differential diagnosis of leiomyosarcoma (LMS) is particularly challenging because clinical, laboratory, and ultrasound features overlap between fibroids and LMS. In this work, we present a human-interpretable machine learning (ML) pipeline to support the preoperative differential diagnosis of LMS from leiomyomas, based on both clinical data and gynecological ultrasound assessments of 68 patients (8 with an LMS diagnosis). The pipeline provides the following novel contributions: (i) end-users were involved both in defining the ML tasks and in evaluating the overall approach; (ii) clinical specialists gain a full understanding of both the decision-making mechanisms of the ML algorithms and the impact of each feature on every automatic decision. Moreover, the pipeline addresses the imbalance between the two classes, by analyzing and selecting the best combination of synthetic minority-class oversampling strategy and classification algorithm among several candidates, as well as the explainability of the features at both the global and local level. The results show very high performance for the best strategy (AUC = 0.99, F1 = 0.87) and a strong, stable impact of two ultrasound-based features (tumor borders and consistency of the lesions). Furthermore, the SHAP algorithm was exploited to quantify feature impact at the local level, and a dedicated module was developed to provide a template-based natural language (NL) translation of the explanations, enhancing their interpretability and fostering the use of ML in the clinical setting.
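The pipeline steps described in the abstract — rebalancing the minority (LMS) class, fitting a classifier, and translating per-feature contributions into template-based NL sentences — can be sketched as follows. This is an illustrative sketch only, not the authors' code: the feature names and the naive duplication-based oversampling (a stand-in for the paper's synthetic oversampling, e.g. SMOTE) are assumptions, and the per-feature contributions use the closed-form SHAP values of a linear model, φ_i = w_i(x_i − E[x_i]), rather than the SHAP library.

```python
# Illustrative sketch (not the authors' implementation): oversample the
# minority class, fit a linear classifier, and render a template-based NL
# explanation of per-feature contributions. For a linear model with
# independent features, SHAP values reduce to w_i * (x_i - mean_i).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample

rng = np.random.default_rng(0)
FEATURES = ["tumor_borders", "lesion_consistency"]  # hypothetical names

# Synthetic imbalanced data: 60 benign (class 0) vs 8 malignant (class 1),
# echoing the 68-patient / 8-LMS imbalance described in the abstract.
X0 = rng.normal(0.0, 1.0, size=(60, 2))
X1 = rng.normal(2.0, 1.0, size=(8, 2))

# Naive oversampling of the minority class by resampling with replacement
# (a stand-in for the synthetic oversampling strategies compared in the paper).
X1_up = resample(X1, replace=True, n_samples=60, random_state=0)
X_bal = np.vstack([X0, X1_up])
y_bal = np.concatenate([np.zeros(60, dtype=int), np.ones(60, dtype=int)])

clf = LogisticRegression().fit(X_bal, y_bal)

def explain(x):
    """Template-based NL translation of linear-model SHAP values."""
    phi = clf.coef_[0] * (x - X_bal.mean(axis=0))  # exact SHAP for linear models
    sentences = []
    for name, contrib in zip(FEATURES, phi):
        direction = "increases" if contrib > 0 else "decreases"
        sentences.append(
            f"'{name}' {direction} the predicted LMS risk "
            f"by {abs(contrib):.2f} (log-odds)."
        )
    return " ".join(sentences)

print(explain(np.array([2.5, 2.0])))
```

The template step is deliberately simple: each SHAP value is mapped to one sentence stating the feature, the direction of its effect, and its magnitude, which mirrors the kind of NL rendering the abstract describes without depending on any particular explanation library.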
Pages: 16