Explanation of machine learning models using shapley additive explanation and application for real data in hospital

Cited by: 264
Authors
Nohara, Yasunobu [1 ]
Matsumoto, Koutarou [2 ]
Soejima, Hidehisa [3 ]
Nakashima, Naoki [4 ]
Affiliations
[1] Kumamoto Univ, Kumamoto, Japan
[2] Kurume Univ, Fukuoka, Japan
[3] Saiseikai Kumamoto Hosp, Kumamoto, Japan
[4] Kyushu Univ Hosp, Fukuoka, Japan
Keywords
Shapley additive explanation; Machine learning; Interpretability; Feature importance; Feature packing;
DOI
10.1016/j.cmpb.2021.106584
CLC number
TP39 [Computer applications];
Discipline codes
081203 ; 0835 ;
Abstract
Background and Objective: When machine learning techniques are used in decision-making processes, the interpretability of the models is important. In the present paper, we adopted the Shapley additive explanation (SHAP), which is based on fair profit allocation among many stakeholders depending on their contribution, for interpreting a gradient-boosting decision tree model using hospital data. Methods: For better interpretability, we propose two novel techniques: (1) a new metric of feature importance using SHAP and (2) a technique termed feature packing, which packs multiple similar features into one grouped feature to allow an easier understanding of the model without reconstruction of the model. We then compared the explanation results between the SHAP framework and existing methods using cerebral infarction data from our hospital. Results: The interpretation by SHAP was mostly consistent with that by the existing methods. Using the proposed techniques, we showed how the A/G ratio works as an important prognostic factor for cerebral infarction. Conclusion: Our techniques are useful for interpreting machine learning models and can uncover the underlying relationships between features and the outcome. (C) 2021 Elsevier B.V. All rights reserved.
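The abstract's two ideas can be sketched in a few lines of plain Python: Shapley values allocate the difference between a model's prediction and a baseline prediction fairly across features, and because the attributions are additive, "packing" a group of similar features amounts to summing their per-instance Shapley values with no retraining. The snippet below is a minimal illustration under assumptions of my own (the toy three-feature model, the zero baseline, and the variable names are hypothetical, not from the paper; the paper applies SHAP to a gradient-boosting tree model):

```python
from itertools import combinations
from math import factorial

# Hypothetical toy model over three features; the first two interact.
def model(x):
    return 2.0 * x[0] - 1.5 * x[1] + 0.5 * x[2] + x[0] * x[1]

BASELINE = (0.0, 0.0, 0.0)  # assumed reference input ("feature absent")

def shapley_values(x, baseline=BASELINE):
    """Exact Shapley values by subset enumeration (fair-allocation view)."""
    n = len(x)

    def f_masked(subset):
        # Features in `subset` keep their real values; others use the baseline.
        z = [x[i] if i in subset else baseline[i] for i in range(n)]
        return model(tuple(z))

    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                # Shapley weight for a coalition of size |S|.
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                phi[i] += w * (f_masked(set(S) | {i}) - f_masked(set(S)))
    return phi

x = (1.0, 2.0, 3.0)
phi = shapley_values(x)

# Additivity: baseline prediction + sum of attributions = model output.
assert abs(model(BASELINE) + sum(phi) - model(x)) < 1e-9

# "Feature packing" in the spirit of the paper: attribute a group of
# related features jointly by summing their per-instance Shapley values,
# without reconstructing the model.
packed_first_two = phi[0] + phi[1]
```

In practice, for a gradient-boosting model one would obtain the per-instance values from an efficient tree-based SHAP implementation rather than by this exponential enumeration; the grouping step itself is the same summation.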
Pages: 6
Related papers (50 total)
  • [41] Choosing Prediction Over Explanation in Psychology: Lessons From Machine Learning
    Yarkoni, Tal
    Westfall, Jacob
    PERSPECTIVES ON PSYCHOLOGICAL SCIENCE, 2017, 12 (06) : 1100 - 1122
  • [42] Facilitating Machine Learning Model Comparison and Explanation through a Radial Visualisation
    Zhou, Jianlong
    Huang, Weidong
    Chen, Fang
    ENERGIES, 2021, 14 (21)
  • [43] Automatic Extraction of Ontological Explanation for Machine Learning-Based Systems
    Chondamrongkul, Nacha
    Temdee, Punnarumol
    INTERNATIONAL JOURNAL OF SOFTWARE ENGINEERING AND KNOWLEDGE ENGINEERING, 2023, 33 (01) : 133 - 156
  • [44] Explanation of Machine-Learning Solutions in Air-Traffic Management
    Xie, Yibing
    Pongsakornsathien, Nichakorn
    Gardi, Alessandro
    Sabatini, Roberto
    AEROSPACE, 2021, 8 (08)
  • [45] Enhancing Attention's Explanation Using Interpretable Tsetlin Machine
    Yadav, Rohan Kumar
    Nicolae, Dragos Constantin
    ALGORITHMS, 2022, 15 (05)
  • [47] Data analysis with Shapley values for automatic subject selection in Alzheimer's disease data sets using interpretable machine learning
    Bloch, Louise
    Friedrich, Christoph M.
    ALZHEIMERS RESEARCH & THERAPY, 2021, 13 (01)
  • [48] Viscosity and melting temperature prediction of mold fluxes based on explainable machine learning and SHapley additive exPlanations
    Yan, Wei
    Shen, Yangyang
    Chen, Shoujie
    Wang, Yongyuan
    JOURNAL OF NON-CRYSTALLINE SOLIDS, 2024, 636
  • [49] Machine Learning Models Based on Grid-Search Optimization and Shapley Additive Explanations (SHAP) for Early Stroke Prediction
    Al Mamlook, Rabia Emhamed
    Lahwal, Fathia
    Elgeberi, Najat
    Obeidat, Muhammad
    Al-Na'amneh, Qais
    Nasayreh, Ahmad
    Gharaibeh, Hasan
    Gharaibeh, Tasnim
    Bzizi, Hanin
    4TH INTERDISCIPLINARY CONFERENCE ON ELECTRICS AND COMPUTER, INTCEC 2024, 2024,
  • [50] Application of machine learning methods on real bridge monitoring data
    Wedel, Frederik
    Marx, Steffen
    ENGINEERING STRUCTURES, 2022, 250