An Aerial Target Recognition Algorithm Based on Self-Attention and LSTM

Citations: 0
Authors
Liang, Futai [1 ,2 ]
Chen, Xin [1 ]
He, Song [1 ]
Song, Zihao [1 ]
Lu, Hao [3 ]
Affiliations
[1] Early Warning Acad, Dept Intelligence, Wuhan 430019, Peoples R China
[2] 31121 PLA Troops, Nanjing 210000, Peoples R China
[3] Early Warning Acad, Informat Technol Room, Wuhan 430019, Peoples R China
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2024, Vol. 81, No. 01
Keywords
Aerial target recognition; long short-term memory network; self-attention; three-point estimation; NETWORKS;
DOI
10.32604/cmc.2024.055326
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In aerial target recognition, a single sensor measurement produces a relatively large recognition error because of noise; at the same time, few or no measured samples are available, which makes it difficult to apply machine learning methods and limits the intelligence and effectiveness of recognition. To address these problems, an aerial target recognition algorithm based on self-attention and a Long Short-Term Memory network (LSTM) is proposed. The LSTM effectively extracts temporal dependencies, while the attention mechanism computes a weight for each input element and applies it to the LSTM's hidden states, adjusting how much attention the LSTM pays to each part of the input. This combination retains the learning ability of the LSTM while adding the advantages of the attention mechanism, giving the model stronger feature extraction ability and adaptability when processing sequence data. In addition, based on prior information about the target's multidimensional characteristics, the three-point estimation method is used to simulate an aerial target recognition dataset for training the recognition model. Experimental results show that, compared with multi-attribute decision-making (MADM) based on fuzzy numbers, the proposed algorithm achieves more than 91% recognition accuracy, a lower false alarm rate, and higher robustness.
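The two building blocks described in the abstract can be sketched in a few lines. This is an illustrative sketch only, not the paper's implementation: the attention here is parameter-free dot-product self-attention over a given sequence of hidden states (the paper's learned projections and trained LSTM are omitted), and the three-point formula is the standard PERT estimate with hypothetical feature ranges.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_over_states(H):
    """Re-weight a sequence of LSTM hidden states H of shape (T, d) with
    scaled dot-product self-attention; returns the attended states and
    the (T, T) attention-weight matrix."""
    d = H.shape[-1]
    scores = H @ H.T / np.sqrt(d)      # pairwise similarities between steps
    A = softmax(scores, axis=-1)       # each row is a weight distribution
    return A @ H, A                    # weighted combination of states

def three_point_samples(a, m, b, n, rng):
    """PERT-style three-point estimate: given optimistic a, most-likely m,
    and pessimistic b, sample n values from a normal with mean (a+4m+b)/6
    and std (b-a)/6, clipped to [a, b]."""
    mean, std = (a + 4.0 * m + b) / 6.0, (b - a) / 6.0
    return np.clip(rng.normal(mean, std, size=n), a, b)

rng = np.random.default_rng(0)
# hypothetical prior range for one target feature (e.g. radar cross-section)
rcs = three_point_samples(0.5, 2.0, 5.0, 1000, rng)

H = rng.normal(size=(6, 8))            # 6 time steps of 8-dim hidden states
ctx, A = self_attention_over_states(H)
assert np.allclose(A.sum(axis=-1), 1.0)  # rows are valid distributions
```

In the actual model the attention weights would come from learned query/key projections of the LSTM outputs, and the simulated samples would be drawn per feature dimension from the prior characteristic ranges of each target class.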
Pages: 1101-1121 (21 pages)
Related Papers (50 records)
  • [1] SSD image target detection algorithm based on self-attention
    Chu Y.
    Huang Y.
    Zhang X.
    Liu H.
    Huazhong University of Science and Technology (48): 70-75
  • [2] Cyclic Self-attention for Point Cloud Recognition
    Zhu, Guanyu
    Zhou, Yong
    Yao, Rui
    Zhu, Hancheng
    Zhao, Jiaqi
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2023, 19 (01)
  • [3] Finger Vein Recognition Based on ResNet With Self-Attention
    Zhang, Zhibo
    Chen, Guanghua
    Zhang, Weifeng
    Wang, Huiyang
    IEEE ACCESS, 2024, 12: 1943-1951
  • [4] Object Detection Algorithm Based on Context Information and Self-Attention Mechanism
    Liang, Hong
    Zhou, Hui
    Zhang, Qian
    Wu, Ting
    SYMMETRY-BASEL, 2022, 14 (05)
  • [5] Hashtag Recommendation Using LSTM Networks with Self-Attention
    Shen, Yatian
    Li, Yan
    Sun, Jun
    Ding, Wenke
    Shi, Xianjin
    Zhang, Lei
    Shen, Xiajiong
    He, Jing
    CMC-COMPUTERS MATERIALS & CONTINUA, 2019, 61 (03): 1261-1269
  • [6] A Self-attention Network for Face Detection Based on Unmanned Aerial Vehicles
    Hua, Shunfu
    Fan, Huijie
    Ding, Naida
    Li, Wei
    Tang, Yandong
    INTELLIGENT ROBOTICS AND APPLICATIONS (ICIRA 2022), PT II, 2022, 13456: 440-449
  • [7] Self-attention for Speech Emotion Recognition
    Tarantino, Lorenzo
    Garner, Philip N.
    Lazaridis, Alexandros
    INTERSPEECH 2019, 2019: 2578-2582
  • [8] Grain protein function prediction based on self-attention mechanism and bidirectional LSTM
    Liu, Jing
    Tang, Xinghua
    Guan, Xiao
    BRIEFINGS IN BIOINFORMATICS, 2023, 24 (01)
  • [9] GCN-Based LSTM Autoencoder with Self-Attention for Bearing Fault Diagnosis
    Lee, Daehee
    Choo, Hyunseung
    Jeong, Jongpil
    SENSORS, 2024, 24 (15)
  • [10] Self-Attention based Siamese Neural Network recognition Model
    Liu, Yuxing
    Chang, Geng
    Fu, Guofeng
    Wei, Yingchao
    Lan, Jie
    Liu, Jiarui
    2022 34TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2022: 721-724