Quantitative Evaluation for Robustness of Intelligent Fault Diagnosis Algorithms Based on Self-attention Mechanism

Cited: 0
Authors
Liu, He [1 ,2 ]
Wei, Cheng [1 ]
Sun, Bo [2 ]
Affiliations
[1] Harbin Inst Technol, Sch Astronaut, Harbin, Peoples R China
[2] China Acad Space Technol, Beijing Inst Spacecraft Syst Engn, Beijing, Peoples R China
Source
JOURNAL OF INTERNET TECHNOLOGY | 2024, Vol. 25, No. 06
Keywords
Injected noise; Sub-indicators; Robustness; Self-attention;
DOI
10.70003/160792642024112506012
Chinese Library Classification
TP [Automation technology; computer technology]
Discipline Classification Code
0812
Abstract
Currently, various algorithmic models encounter numerous challenges in practical applications, such as noise, interference, and input changes, which can significantly degrade their performance. Many methods have been proposed to enhance model robustness; however, to assess the effectiveness of these improvements, it is generally necessary to compare a model's performance before and after applying the same noise and analyze the resulting changes. Moreover, to evaluate the robustness of multiple models that all meet the basic requirements of a specific task, typically only a qualitative analysis using specific indicators is performed. This is especially crucial in fault diagnosis, where multiple types of noise interference in the data can hinder accurate fault classification. To address this situation, this paper presents a quantitative evaluation method for the robustness of intelligent fault diagnosis algorithms based on the self-attention mechanism. The proposed method divides the dataset into sub-datasets by signal-to-noise ratio after injecting noise, separately calculates sub-indicators after training, dynamically assigns weights to these indicators using the self-attention mechanism, and combines the weighted sub-indicators into a comprehensive evaluation value for assessing robustness. The method is validated through experiments on three models, and the results demonstrate the reliability of this quantitative approach to measuring robustness.
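The weighting pipeline described in the abstract (sub-indicators per SNR band, self-attention weighting, weighted combination) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the identity query/key/value projections, the example SNR bands, and the final softmax normalization are all assumptions made for clarity.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def robustness_score(sub_indicators):
    """Hypothetical sketch: scaled dot-product self-attention over
    scalar sub-indicators (one per SNR band), then a weighted sum.

    sub_indicators -- e.g. classification accuracy on each SNR sub-dataset.
    """
    d = 1  # each sub-indicator acts as a 1-dimensional embedding
    # Identity projections for illustration: q = k = v = the indicators.
    q = k = v = sub_indicators
    n = len(sub_indicators)
    # Row-wise attention: softmax over q_i * k_j / sqrt(d).
    attended = []
    for i in range(n):
        row = softmax([q[i] * k[j] / math.sqrt(d) for j in range(n)])
        attended.append(sum(w * v[j] for j, w in enumerate(row)))
    # Normalize the attended outputs into weights, then combine.
    weights = softmax(attended)
    return sum(w * x for w, x in zip(weights, sub_indicators))

# Hypothetical accuracies at, e.g., -4 dB, 0 dB, 4 dB, and 8 dB:
print(round(robustness_score([0.62, 0.78, 0.91, 0.95]), 3))
```

Because the weights form a convex combination, the resulting score always lies between the worst and best sub-indicator, which is the property that makes it usable as a single comparable robustness value across models.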
Pages: 921-929 (9 pages)
Related Papers
(50 records in total)
  • [21] A lightweight and rapidly converging transformer based on separable linear self-attention for fault diagnosis
    Yin, Kexin
    Chen, Chunjun
    Shen, Qi
    Deng, Ji
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2025, 36 (01)
  • [22] A novel time-frequency Transformer based on self-attention mechanism and its application in fault diagnosis of rolling bearings
    Ding, Yifei
    Jia, Minping
    Miao, Qiuhua
    Cao, Yudong
    MECHANICAL SYSTEMS AND SIGNAL PROCESSING, 2022, 168
  • [23] Fault diagnosis of rotating machinery using novel self-attention mechanism TCN with soft thresholding method
    Ding, Li
    Li, Qing
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35 (04)
  • [24] CLFormer: A Lightweight Transformer Based on Convolutional Embedding and Linear Self-Attention With Strong Robustness for Bearing Fault Diagnosis Under Limited Sample Conditions
    Fang, Hairui
    Deng, Jin
    Bai, Yaoxu
    Feng, Bo
    Li, Sheng
    Shao, Siyu
    Chen, Dongsheng
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2022, 71
  • [25] A novel data augmentation method for steering mechanism fault diagnosis based on variational autoencoding generative adversarial networks with self-attention
    Lei, Tongfei
    Pei, Zeyu
    Pan, Feng
    Li, Bing
    Xu, Yongsheng
    Shao, Haidong
    Zhao, Ke
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35 (05)
  • [26] Bearing fault diagnosis network based on adaptive dimension-increasing and convolutional self-attention
    Guan, Le
    Wang, Xinyang
    Yang, Duo
    Zhang, Tianqi
    Zhu, Li
    Chen, Jianguo
    Wang, Zhen
    Zhendong yu Chongji/Journal of Vibration and Shock, 2024, 43 (17): 289-299
  • [27] An Incipient Fault Diagnosis Method Based on Complex Convolutional Self-Attention Autoencoder for Analog Circuits
    Gao, Tianyu
    Yang, Jingli
    Jiang, Shouda
    Li, Ye
    IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2024, 71 (08): 9727-9736
  • [28] Face Inpainting Based on Dual Self-attention Mechanism
    Yue H.
    Liao L.
    Yang J.
    Hunan Daxue Xuebao/Journal of Hunan University Natural Sciences, 2023, 50 (08): 32-41
  • [29] Multimodal Fusion Method Based on Self-Attention Mechanism
    Zhu, Hu
    Wang, Ze
    Shi, Yu
    Hua, Yingying
    Xu, Guoxia
    Deng, Lizhen
    WIRELESS COMMUNICATIONS & MOBILE COMPUTING, 2020, 2020
  • [30] Double Attention: An Optimization Method for the Self-Attention Mechanism Based on Human Attention
    Zhang, Zeyu
    Li, Bin
    Yan, Chenyang
    Furuichi, Kengo
    Todo, Yuki
    BIOMIMETICS, 2025, 10 (01)