A Noise-aware Deep Learning Model for Automatic Modulation Recognition in Radar Signals

Cited by: 6
Authors
Aslinezhad, M. [1 ]
Sezavar, A. [2 ]
Malekijavan, A. [1 ]
Affiliations
[1] Shahid Sattari Aeronaut Univ Sci & Technol, Dept Elect Engn, Tehran, Iran
[2] Univ Birjand, Dept Elect & Comp Engn, Birjand, Iran
Source
INTERNATIONAL JOURNAL OF ENGINEERING | 2023 / Vol. 36 / Issue 08
Keywords
Modulation Classification; Deep Learning; Noise-aware Systems; Convolutional Neural Networks; CNN; Classification
DOI
10.5829/ije.2023.36.08b.06
Chinese Library Classification
T [Industrial Technology]
Discipline Code
08
Abstract
Automatic waveform recognition has become an important task in radar systems and spread-spectrum communications. Identifying the modulation of received signals helps to recognize different intruder transmitters. In this paper, a noise-aware model is proposed to recognize the modulation type from time-frequency characteristics. To this end, the Choi-Williams representation is used to obtain a 2D time-frequency pattern of the received signal. A deep model combining an auto-encoder and Convolutional Neural Networks (CNN) is then constructed to denoise the signal and extract robust, discriminative features from the time-frequency pattern. To reduce the effect of noise and adversarial distortions, a new database of modulation patterns under different AWGN noise levels and Rayleigh fading channels is created, which helps the model resist the effects of noise on modulation recognition. The database contains radar modulations such as Barker, LFM, Costas, and Frank codes, which are frequently used in wireless communication. In fact, the main novelty of this work is the design of this database and the proposed noise-aware model. Experimental results demonstrate that the proposed model achieves superior performance for automatic modulation recognition, with 99.24% accuracy in a noisy medium with a minimum SNR of -5 dB, and 97.90% accuracy at an SNR of -5 dB with a Doppler frequency of f = 15 Hz. Our model outperforms prior methods by 5.54% at negative SNRs and 0.4% at positive SNRs (even with lower SNR).
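The paper does not publish its implementation, but the preprocessing step it describes, mapping a received signal to a 2D Choi-Williams time-frequency pattern, can be sketched as below. This is an illustrative discrete approximation of the Cohen's-class formulation (ambiguity function multiplied by the kernel exp(-θ²τ²/σ)); the function name and the `sigma` parameter are assumptions for the sketch, not the authors' code.

```python
import numpy as np

def choi_williams(x, sigma=1.0):
    """Discrete Choi-Williams distribution of a complex signal.

    Illustrative sketch only: ambiguity-function route with the
    CWD kernel exp(-theta^2 tau^2 / sigma). Returns an N x N
    real-valued time-frequency map.
    """
    x = np.asarray(x, dtype=complex)
    N = len(x)
    # Instantaneous autocorrelation K[n, m] = x[n+m] * conj(x[n-m])
    K = np.zeros((N, N), dtype=complex)
    for n in range(N):
        mmax = min(n, N - 1 - n)
        for m in range(-mmax, mmax + 1):
            K[n, m % N] = x[n + m] * np.conj(x[n - m])
    # Ambiguity function: DFT along the time axis (n -> Doppler theta)
    A = np.fft.fft(K, axis=0)
    # Choi-Williams kernel suppresses cross-terms away from the axes
    theta = 2 * np.pi * np.fft.fftfreq(N)  # Doppler bins (angular)
    tau = np.fft.fftfreq(N) * N            # signed lag bins
    Phi = np.exp(-np.outer(theta**2, tau**2) / sigma)
    # Back to time, then lag -> frequency: the time-frequency map
    C = np.fft.fft(np.fft.ifft(A * Phi, axis=0), axis=1)
    return np.real(C)

# Example: an LFM (chirp) pulse, one of the radar modulations in the
# paper's database; its CWD energy concentrates along a sloped line.
t = np.arange(128)
lfm = np.exp(1j * 2 * np.pi * (0.05 * t + 0.001 * t**2))
tfd = choi_williams(lfm, sigma=1.0)
print(tfd.shape)  # (128, 128)
```

In the paper's pipeline, a map like `tfd` would then be fed to the auto-encoder/CNN stage as a 2D image.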
Pages: 1459-1467 (9 pages)