Quantitative Evaluation for Robustness of Intelligent Fault Diagnosis Algorithms Based on Self-attention Mechanism

Times Cited: 0
Authors
Liu, He [1 ,2 ]
Wei, Cheng [1 ]
Sun, Bo [2 ]
Affiliations
[1] Harbin Inst Technol, Sch Astronaut, Harbin, Peoples R China
[2] China Acad Space Technol, Beijing Inst Spacecraft Syst Engn, Beijing, Peoples R China
Source
JOURNAL OF INTERNET TECHNOLOGY | 2024, Vol. 25, No. 6
Keywords
Implantation noise; Sub-indicators; Robustness; Self-attention
DOI
10.70003/160792642024112506012
Chinese Library Classification (CLC)
TP [Automation technology, computer technology]
Discipline Classification Code
0812
Abstract
Currently, algorithmic models face numerous challenges in practical applications, such as noise, interference, and input changes, all of which can significantly degrade their performance. Many methods have been proposed to enhance model robustness. However, assessing the effectiveness of such improvements generally requires comparing a model's performance before and after applying the same noise and analyzing the resulting changes. Moreover, evaluating the robustness of multiple models that meet the basic requirements of a given task typically relies on qualitative analysis with specific indicators. This is especially critical in fault diagnosis, where multiple types of noise interference in the data can hinder accurate fault classification. To address this situation, this paper presents a quantitative method for evaluating the robustness of intelligent fault diagnosis algorithms based on the self-attention mechanism. The proposed method divides the dataset into sub-datasets according to signal-to-noise ratio after noise injection, separately calculates sub-indicators after training, dynamically assigns weights to these indicators using the self-attention mechanism, and combines the weighted sub-indicators into a comprehensive evaluation value for assessing robustness. The method is validated through experiments on three models, and the results demonstrate the reliability of this quantitative approach to measuring robustness.
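For illustration only, the following Python sketch mirrors the pipeline the abstract outlines: noise is injected at a chosen signal-to-noise ratio, a per-SNR sub-indicator (here, diagnosis accuracy) is assumed to have been computed for each sub-dataset, and a self-attention-style softmax assigns the weights that are combined into a single robustness value. The function names, the accuracy figures, and the specific attention formulation are assumptions made for the sketch, not the paper's implementation.

# Minimal sketch under assumed names and formulas; not the authors' code.
import numpy as np

def inject_noise_by_snr(signal, snr_db, rng):
    # Add white Gaussian noise so the noisy signal has the requested SNR (dB).
    signal_power = np.mean(signal ** 2)
    noise_power = signal_power / (10.0 ** (snr_db / 10.0))
    noise = rng.normal(0.0, np.sqrt(noise_power), size=signal.shape)
    return signal + noise

def attention_weights(sub_indicators):
    # Scaled dot-product self-attention over the sub-indicator vector,
    # pooled to one normalized weight per indicator.
    x = np.asarray(sub_indicators, dtype=float).reshape(-1, 1)   # (n, 1)
    scores = (x @ x.T) / np.sqrt(x.shape[1])                     # (n, n)
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))    # row-wise softmax
    attn /= attn.sum(axis=1, keepdims=True)
    weights = attn.mean(axis=0)                                  # pool to (n,)
    return weights / weights.sum()

def robustness_score(sub_indicators):
    # Combine per-SNR sub-indicators into one comprehensive evaluation value.
    w = attention_weights(sub_indicators)
    return float(np.dot(w, sub_indicators))

# Example: diagnosis accuracy of one trained model on each SNR sub-dataset,
# e.g. SNR = 10, 5, 0, -5, -10 dB (hypothetical values).
accuracies = [0.95, 0.91, 0.84, 0.72, 0.60]
print(round(robustness_score(accuracies), 4))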
Pages: 921-929 (9 pages)