Robust and Explainable Fault Diagnosis With Power-Perturbation-Based Decision Boundary Analysis of Deep Learning Models

Cited by: 11
Authors
Gwak, Minseon [1 ]
Kim, Min Su [1 ]
Yun, Jong Pil [2 ,3 ]
Park, PooGyeon [1 ]
Affiliations
[1] Pohang Univ Sci & Technol, Dept Elect Engn, Pohang 37673, Gyungbuk, South Korea
[2] Korea Inst Ind Technol, Gyongsan 38408, South Korea
[3] Univ Sci & Technol, KITECH Sch, Daejeon 34113, South Korea
Funding
National Research Foundation of Singapore
Keywords
Data models; Vibrations; Feature extraction; Analytical models; Robustness; Perturbation methods; Time-domain analysis; Bearing; convolutional neural network (CNN); explainable artificial intelligence; interpretable machine learning; vibration signal; visualization; NEURAL-NETWORK; SIGNAL;
DOI
10.1109/TII.2022.3207758
CLC number
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
The robustness of neural network models is important in fault diagnosis (FD) because uncertainty in operating conditions alters the power spectral densities of vibration data; yet, owing to the limited explainability of such models, this effect remains hidden from users. This article proposes an FD framework with a power-perturbation-based decision boundary analysis (POBA) that explains the decision boundaries of vibration classification models. In POBA, perturbed data are generated from the training data by perturbing the power of frequency bands centered on dominant class-discriminative frequencies. The decision boundary of a model is then evaluated and visualized for users by testing the model on the perturbed data. Furthermore, the decision boundary information can be used to define a per-class robustness score, and a robust model can be obtained by ensembling trained models according to these scores. Experiments on two vibration datasets verify the explainability and robustness of the proposed FD framework.
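The core operation the abstract describes — perturbing the power of a frequency band centered on a chosen frequency and reconstructing the time-domain signal — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `perturb_band` and all parameters (`fs`, `center_hz`, `bandwidth_hz`, `gain`) are hypothetical, and the paper's choice of band centers (dominant class-discriminative frequencies) and gain schedule is not reproduced here.

```python
import numpy as np

def perturb_band(signal, fs, center_hz, bandwidth_hz, gain):
    """Scale the spectrum of `signal` (sampled at `fs` Hz) by `gain`
    inside the band [center_hz - bandwidth_hz/2, center_hz + bandwidth_hz/2],
    then return the reconstructed time-domain signal.
    Illustrative sketch only; names and parameters are assumptions."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = np.abs(freqs - center_hz) <= bandwidth_hz / 2.0
    spectrum[band] *= gain          # amplify or attenuate the band's power
    return np.fft.irfft(spectrum, n=len(signal))

# Example: double the spectral amplitude in a 10 Hz band around 100 Hz
# of a synthetic two-tone vibration signal; the 250 Hz tone is untouched.
fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 100 * t) + 0.5 * np.sin(2 * np.pi * 250 * t)
y = perturb_band(x, fs, center_hz=100.0, bandwidth_hz=10.0, gain=2.0)
```

A classifier's decision boundary would then be probed by sweeping `gain` (and the band center) over a grid and recording where the predicted class flips, which is the kind of information the robustness score summarizes per class.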
Pages: 6982-6992 (11 pages)