Acoustic Emission Recognition Based on a Three-Streams Neural Network with Attention

Cited by: 0
|
Authors
Xiaofeng K. [1 ]
Kun H. [2 ]
Li R. [3 ]
Affiliations
[1] College of Information and Engineering, Xuzhou University of Technology, Jiangsu, Xuzhou
[2] College of Electrical and Power Engineering, China University of Mining and Technology, Jiangsu, Xuzhou
[3] Department of Electrical, Electronic and Computer Engineering, University of Western Australia, Perth
Source
Computer Systems Science and Engineering | 2023 / Vol. 46 / No. 03
Funding
UK Research and Innovation;
Keywords
acoustic emission; attention mechanism; convolutional neural network; fault detection;
DOI
10.32604/csse.2023.025908
Abstract
Acoustic emission (AE) is a nondestructive real-time monitoring technology that has been proven to be a valid way of monitoring dynamic damage to materials. Classification and recognition methods for rotor AE signals have mostly relied on traditional machine learning. Given the success of deep learning, where Recurrent Neural Networks (RNN) have been widely applied to sequential classification tasks and Convolutional Neural Networks (CNN) to image recognition tasks, a novel three-stream neural network with attention (TSANN) is proposed in this paper for fault detection. Based on residual connections and an attention mechanism, each stream of the model learns the most informative representation from the Mel Frequency Cepstrum Coefficient (MFCC), Tempogram, and short-time Fourier transform (STFT) spectrogram, respectively. Experimental results show that, in comparison with traditional classification methods and single-stream CNN networks, TSANN achieves the best overall performance and reduces the classification error rate by up to 50%, which demonstrates the effectiveness of the proposed model. © 2023 CRL Publishing. All rights reserved.
Pages: 2963-2974
Number of pages: 11
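The abstract describes the architecture only at a high level: three parallel streams (MFCC, Tempogram, STFT spectrogram), each with residual connections and attention, fused for classification. The following is a minimal, hypothetical PyTorch sketch of that general idea; the module names, layer widths, squeeze-and-excitation-style channel attention, late-fusion scheme, and four-class output are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a three-stream network with per-stream attention,
# loosely following the abstract (MFCC, tempogram, STFT streams fused for
# classification). Not the paper's code; shapes and widths are placeholders.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with a skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        return torch.relu(x + self.body(x))


class AttentionPool(nn.Module):
    """Channel attention (squeeze-and-excitation style) + global average pooling."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, x):
        x = x * self.gate(x)            # reweight channels by learned attention
        return self.pool(x).flatten(1)  # (batch, channels)


class Stream(nn.Module):
    """One stream: stem convolution -> residual block -> attention pooling."""
    def __init__(self, in_ch=1, width=32):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(in_ch, width, 3, stride=2, padding=1),
            nn.BatchNorm2d(width),
            nn.ReLU(inplace=True),
        )
        self.res = ResidualBlock(width)
        self.att = AttentionPool(width)

    def forward(self, x):
        return self.att(self.res(self.stem(x)))


class ThreeStreamNet(nn.Module):
    """Late fusion of MFCC, tempogram, and STFT streams."""
    def __init__(self, num_classes=4, width=32):
        super().__init__()
        self.mfcc_stream = Stream(width=width)
        self.tempo_stream = Stream(width=width)
        self.stft_stream = Stream(width=width)
        self.classifier = nn.Linear(3 * width, num_classes)

    def forward(self, mfcc, tempogram, stft):
        feats = torch.cat(
            [self.mfcc_stream(mfcc),
             self.tempo_stream(tempogram),
             self.stft_stream(stft)], dim=1)
        return self.classifier(feats)


if __name__ == "__main__":
    # Dummy inputs shaped (batch, 1, freq_bins, time_frames); sizes are placeholders.
    model = ThreeStreamNet()
    mfcc = torch.randn(2, 1, 20, 128)
    tempo = torch.randn(2, 1, 384, 128)
    stft = torch.randn(2, 1, 257, 128)
    print(model(mfcc, tempo, stft).shape)  # torch.Size([2, 4])
```

Because each stream ends in adaptive pooling, the three feature types can have different time-frequency resolutions; the streams are only combined at the fused classification layer, which is one common way to realize the multi-stream design the abstract describes.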