Efficient convolutional dual-attention transformer for automatic modulation recognition

Cited: 0
Authors
Yi, Zengrui [1 ]
Meng, Hua [1 ]
Gao, Lu [2 ]
He, Zhonghang [2 ]
Yang, Meng [1 ]
Affiliations
[1] Southwest Jiaotong Univ, Sch Math, Chengdu 611756, Sichuan, Peoples R China
[2] Natl Key Lab Sci & Technol Test Phys & Numeral Mat, Beijing, Peoples R China
Keywords
Efficient modulation recognition; Lightweight convolution; Dual-attention mechanism; Transformer; CLASSIFICATION;
DOI
10.1007/s10489-024-06202-6
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Automatic modulation recognition (AMR) involves identifying the modulation of electromagnetic signals in a noncollaborative manner. Deep learning-based methods have become a major research focus in the AMR field, but such models are typically trained on standardized data and rely on substantial computational and storage resources. In real-world applications, however, the limited resources of edge devices restrict the deployment of large-scale models, and traditional networks cannot handle real-world signals of varying lengths or with locally missing data. We therefore propose a network structure based on a convolutional Transformer with a dual-attention mechanism. The proposed structure exploits the inductive bias of lightweight convolution and the global modeling capability of the Transformer, fusing local and global features to achieve high recognition accuracy. Moreover, the model adapts to the length of the input signals while remaining robust to incomplete signals. Experimental results on the open-source datasets RML2016.10a, RML2016.10b, and RML2018.01a show that the proposed network achieves 95.05%, 94.79%, and 98.14% accuracy, respectively, with enhancement training, and maintains greater than 90% accuracy when the signals are incomplete. In addition, the proposed network has fewer parameters and a lower computational cost than benchmark methods.
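The abstract describes a lightweight convolutional front end combined with a dual-attention Transformer that fuses local and global features of variable-length I/Q sequences. The following Python (PyTorch) sketch is only a minimal illustration of that general design, not the authors' implementation: the module names (DepthwiseSeparableConv1d, DualAttentionBlock, ConvDualAttentionAMR), the channel-gate formulation of the second attention path, and all dimensions are assumptions chosen for clarity.

# Hypothetical sketch (not the paper's released code): a lightweight
# convolutional embedding followed by dual-attention Transformer blocks,
# operating directly on raw I/Q sequences of arbitrary length.
import torch
import torch.nn as nn


class DepthwiseSeparableConv1d(nn.Module):
    """Lightweight convolution: depthwise + pointwise 1-D convolution."""
    def __init__(self, in_ch, out_ch, kernel_size=5):
        super().__init__()
        self.depthwise = nn.Conv1d(in_ch, in_ch, kernel_size,
                                   padding=kernel_size // 2, groups=in_ch)
        self.pointwise = nn.Conv1d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):            # x: (batch, channels, length)
        return self.pointwise(self.depthwise(x))


class DualAttentionBlock(nn.Module):
    """Global temporal self-attention plus channel attention, then fusion."""
    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.temporal_attn = nn.MultiheadAttention(dim, num_heads,
                                                   batch_first=True)
        # Channel attention: squeeze over time, excite per feature channel.
        self.channel_gate = nn.Sequential(
            nn.Linear(dim, dim // 4), nn.ReLU(),
            nn.Linear(dim // 4, dim), nn.Sigmoid())
        self.norm2 = nn.LayerNorm(dim)
        self.ffn = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(),
                                 nn.Linear(4 * dim, dim))

    def forward(self, x):             # x: (batch, length, dim)
        h = self.norm1(x)
        t, _ = self.temporal_attn(h, h, h)                      # global path
        c = x * self.channel_gate(x.mean(dim=1)).unsqueeze(1)   # channel path
        x = x + t + c                                           # fuse both paths
        return x + self.ffn(self.norm2(x))


class ConvDualAttentionAMR(nn.Module):
    """Variable-length I/Q input -> convolutional embedding -> dual attention."""
    def __init__(self, num_classes=11, dim=64, depth=2):
        super().__init__()
        self.embed = DepthwiseSeparableConv1d(2, dim)    # 2 channels: I and Q
        self.blocks = nn.ModuleList(DualAttentionBlock(dim) for _ in range(depth))
        self.head = nn.Linear(dim, num_classes)

    def forward(self, iq):            # iq: (batch, 2, length), any length
        x = self.embed(iq).transpose(1, 2)               # -> (batch, length, dim)
        for blk in self.blocks:
            x = blk(x)
        return self.head(x.mean(dim=1))                  # pool over time, classify


if __name__ == "__main__":
    model = ConvDualAttentionAMR(num_classes=11)
    for length in (128, 1024):        # RML2016-style and RML2018-style frame lengths
        logits = model(torch.randn(4, 2, length))
        print(length, logits.shape)   # (4, 11) in both cases

Because the convolutional embedding and attention layers are length-agnostic and the classifier pools over time, the same weights accept 128-sample and 1024-sample frames, which mirrors the variable-length property claimed in the abstract.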
Pages: 16
Related Papers
50 records in total
  • [41] Dual-attention transformer-based hybrid network for multi-modal medical image segmentation
    Zhang, Menghui
    Zhang, Yuchen
    Liu, Shuaibing
    Han, Yahui
    Cao, Honggang
    Qiao, Bingbing
    SCIENTIFIC REPORTS, 2024, 14 (01):
  • [42] A Lightweight Transformer with Convolutional Attention
    Zeng, Kungan
    Paik, Incheon
    2020 11TH INTERNATIONAL CONFERENCE ON AWARENESS SCIENCE AND TECHNOLOGY (ICAST), 2020,
  • [43] Spectrum Analysis and Convolutional Neural Network for Automatic Modulation Recognition
    Zeng, Yuan
    Zhang, Meng
    Han, Fei
    Gong, Yi
    Zhang, Jin
    IEEE WIRELESS COMMUNICATIONS LETTERS, 2019, 8 (03) : 929 - 932
  • [44] DrugDAGT: a dual-attention graph transformer with contrastive learning improves drug-drug interaction prediction
    Chen, Yaojia
    Wang, Jiacheng
    Zou, Quan
    Niu, Mengting
    Ding, Yijie
    Song, Jiangning
    Wang, Yansu
    BMC BIOLOGY, 2024, 22 (01)
  • [45] A Dual-Attention Network for Joint Named Entity Recognition and Sentence Classification of Adverse Drug Events
    Wunnava, Susmitha
    Qin, Xiao
    Kakar, Tabassum
    Kong, Xiangnan
    Rundensteiner, Elke A.
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020,
  • [46] Mixed Attention and Channel Shift Transformer for Efficient Action Recognition
    Lu, Xiusheng
    Hao, Yanbin
    Cheng, Lechao
    Zhao, Sicheng
    Li, Yutao
    Song, Mingli
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2025, 21 (03)
  • [47] Efficient Dual Attention Transformer for Image Super-Resolution
    Park, Soobin
    Jeong, Yuna
    Choi, Yong Suk
    39TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, SAC 2024, 2024, : 963 - 970
  • [48] Convolutional Neural Network With Attention Mechanism for SAR Automatic Target Recognition
    Zhang, Ming
    An, Jubai
    Yu, Da Hua
    Yang, Li Dong
    Wu, Liang
    Lu, Xiao Qi
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2022, 19
  • [49] DCTN: Dual-Branch Convolutional Transformer Network With Efficient Interactive Self-Attention for Hyperspectral Image Classification
    Zhou, Yunfei
    Huang, Xiaohui
    Yang, Xiaofei
    Peng, Jiangtao
    Ban, Yifang
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 16
  • [50] DUAL-ATTENTION NETWORK FOR FEW-SHOT SEGMENTATION
    Chen, Zhikui
    Wang, Han
    Zhang, Suhua
    Zhong, Fangming
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 2210 - 2214