Efficient convolutional dual-attention transformer for automatic modulation recognition

Cited: 0
Authors
Yi, Zengrui [1 ]
Meng, Hua [1 ]
Gao, Lu [2 ]
He, Zhonghang [2 ]
Yang, Meng [1 ]
Affiliations
[1] Southwest Jiaotong Univ, Sch Math, Chengdu 611756, Sichuan, Peoples R China
[2] Natl Key Lab Sci & Technol Test Phys & Numeral Mat, Beijing, Peoples R China
Keywords
Efficient modulation recognition; Lightweight convolution; Dual-attention mechanism; Transformer; Classification
DOI
10.1007/s10489-024-06202-6
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Automatic modulation recognition (AMR) involves identifying the modulation of electromagnetic signals in a noncollaborative manner. Deep learning-based methods have become a focal research topic in the AMR field. Such models are frequently trained on standardized data and rely on substantial computational and storage resources. In real-world applications, however, the finite resources of edge devices limit the deployment of large-scale models, and traditional networks cannot handle real-world signals of varying lengths or with locally missing data. Thus, we propose a network structure based on a convolutional Transformer with a dual-attention mechanism. The proposed structure effectively combines the inductive bias of lightweight convolution with the global modeling capability of the Transformer, fusing local features with global features to achieve high recognition accuracy. Moreover, the model adapts to the length of the input signals while remaining robust to incomplete signals. Experimental results on the open-source datasets RML2016.10a, RML2016.10b, and RML2018.01a demonstrate that the proposed network achieves 95.05%, 94.79%, and 98.14% accuracy, respectively, with enhancement training, and maintains greater than 90% accuracy when the signals are incomplete. In addition, the proposed network has fewer parameters and lower computational cost than benchmark methods.
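The architecture described above can be illustrated with a minimal PyTorch sketch. This is an assumption-laden illustration, not the authors' implementation: the abstract does not specify how the two attention branches or the convolution are built, so here "lightweight convolution" is interpreted as a depthwise-separable 1-D convolution, and "dual attention" as multi-head self-attention over time combined with a channel-attention gate. The class names `ConvDualAttentionBlock` and `AMRClassifier` are hypothetical.

```python
import torch
import torch.nn as nn


class ConvDualAttentionBlock(nn.Module):
    """Hypothetical sketch: fuses local features from a lightweight
    (depthwise-separable) convolution with global features from
    multi-head self-attention, then gates channels with attention."""

    def __init__(self, dim: int = 64, heads: int = 4):
        super().__init__()
        # Local branch: depthwise + pointwise convolution (lightweight)
        self.depthwise = nn.Conv1d(dim, dim, kernel_size=3, padding=1, groups=dim)
        self.pointwise = nn.Conv1d(dim, dim, kernel_size=1)
        # Global branch: standard multi-head self-attention over time
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Channel-attention gate over the fused features (the "second" attention)
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool1d(1), nn.Conv1d(dim, dim, kernel_size=1), nn.Sigmoid()
        )
        self.norm = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim); seq_len may vary between batches
        local = self.pointwise(self.depthwise(x.transpose(1, 2)))  # (B, dim, L)
        glob, _ = self.attn(x, x, x)                               # (B, L, dim)
        fused = local + glob.transpose(1, 2)                       # fuse branches
        fused = fused * self.channel_gate(fused)                   # channel attention
        return self.norm(fused.transpose(1, 2) + x)                # residual + norm


class AMRClassifier(nn.Module):
    """Maps raw I/Q signals (2 channels, variable length) to class logits."""

    def __init__(self, n_classes: int = 11, dim: int = 64):
        super().__init__()
        self.embed = nn.Conv1d(2, dim, kernel_size=7, padding=3)
        self.block = ConvDualAttentionBlock(dim)
        self.head = nn.Linear(dim, n_classes)

    def forward(self, iq: torch.Tensor) -> torch.Tensor:
        # iq: (batch, 2, length) -- I and Q components as two channels
        x = self.embed(iq).transpose(1, 2)   # (B, L, dim)
        x = self.block(x)
        return self.head(x.mean(dim=1))      # average over time -> logits
```

Because the temporal dimension is only ever convolved, attended over, and mean-pooled, the same model accepts signals of any length, which mirrors the variable-length property claimed in the abstract.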
Pages: 16
Related papers (50 total)
  • [21] Squeezeformer: An Efficient Transformer for Automatic Speech Recognition
    Kim, Sehoon; Gholami, Amir; Shaw, Albert; Lee, Nicholas; Mangalam, Karttikeya; Malik, Jitendra; Mahoney, Michael W.; Keutzer, Kurt
    Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022
  • [22] A Text Recognition Algorithm Based on a Dual-Attention Mechanism in Complex Driving Environment
    Ding, Ling; Wang, Liyuan; Wang, Yuanfang; Yu, Shaohuai; Xiao, Jinsheng
    Tehnicki Vjesnik-Technical Gazette, 2024, 31(1): 247-253
  • [23] A Complex-Valued Transformer for Automatic Modulation Recognition
    Li, Weihao; Deng, Wen; Wang, Keren; You, Ling; Huang, Zhitao
    IEEE Internet of Things Journal, 2024, 11(12): 22197-22207
  • [24] Unifying Dual-Attention and Siamese Transformer Network for Full-Reference Image Quality Assessment
    Tang, Zhenjun; Chen, Zhiyuan; Li, Zhixin; Zhong, Bineng; Zhang, Xianquan
    ACM Transactions on Multimedia Computing, Communications and Applications, 2023, 19(6)
  • [25] Automatic Prediction of Multiple Associated Diseases Using a Dual-Attention Neural Network Model
    Ren, Yafeng; Wang, Zilin; Tong, Andwei
    Health Information Processing (CHIP 2023), 2023, 1993: 369-391
  • [26] Human Motion Prediction via Dual-Attention and Multi-Granularity Temporal Convolutional Networks
    Huang, Biaozhang; Li, Xinde
    Sensors, 2023, 23(12)
  • [27] Prediction of mechanical properties of rolled steel based on dual-attention multiscale convolutional neural network
    Zhang, Qiwen; Wu, Wenkui; Tang, Xingchang; Jin, Mingzhu
    Materials Today Communications, 2024, 41
  • [28] Few-shot learning based on dual-attention mechanism for orchid species recognition
    Lee, Shih-Hsiung; Ku, Hsuan-Chih; Zhang, Ya-Ci
    International Journal of Data Science and Analytics, 2024
  • [29] Social Network Rumor Detection Method Combining Dual-Attention Mechanism With Graph Convolutional Network
    Liu, Xiaoyang; Zhao, Zhengyang; Zhang, Yihao; Liu, Chao; Yang, Fan
    IEEE Transactions on Computational Social Systems, 2023, 10(5): 2350-2361
  • [30] A Novel Attention Cooperative Framework for Automatic Modulation Recognition
    Chen, Shiyao; Zhang, Yan; He, Zunwen; Nie, Jinbo; Zhang, Wancheng
    IEEE Access, 2020, 8: 15673-15686