Meta-Learning Guided Label Noise Distillation for Robust Signal Modulation Classification

Cited by: 3
Authors
Hao, Xiaoyang [1 ]
Feng, Zhixi [1 ]
Peng, Tongqing [1 ]
Yang, Shuyuan [1 ]
Affiliations
[1] Xidian Univ, Sch Artificial Intelligence, Key Lab Intelligent Percept & Image Understanding, Minist Educ, Xian 710071, Peoples R China
Source
IEEE INTERNET OF THINGS JOURNAL | 2025, Vol. 12, No. 1
Funding
National Natural Science Foundation of China;
Keywords
Noise; Robustness; Training; Noise measurement; Modulation; Industrial Internet of Things; Accuracy; Automatic modulation classification (AMC); few-shot trusted labeled samples; label noise; meta-learning; multiview signal (MVS); RECOGNITION; NETWORKS; FEATURES;
DOI
10.1109/JIOT.2024.3462544
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Automatic modulation classification (AMC) has a wide range of applications in both civilian and military fields, such as industrial Internet of Things (IIoT) security, communication spectrum management, and military electronic countermeasures. However, mislabeled samples often arise in practical scenarios, significantly degrading the performance and robustness of deep neural networks (DNNs). In this article, we propose a meta-learning guided label noise distillation method to enhance the robustness of AMC models against label noise or errors. Specifically, we propose a teacher-student heterogeneous network (TSHN) to discriminate and distill label noise. Guided by the notion that labels carry information, a teacher network trained on a few trusted labeled samples reevaluates and corrects the labels of a large number of untrusted labeled samples through meta-learning. By dividing and conquering the untrusted labeled samples according to their confidence levels, the student network learns more effectively. Additionally, we propose a multiview signal (MVS) method to further improve performance on hard-to-classify categories given only few-shot trusted labeled samples. Extensive experiments on the RadioML2016 and HisarMod2019.1 datasets demonstrate that our methods significantly improve accuracy and robustness in signal AMC across diverse label noise scenarios, including symmetric, asymmetric, and mixed label noise. For example, compared to a baseline convolutional neural network trained with the cross-entropy loss, the proposed TSHN achieves an accuracy improvement of 1.26% to 36.84% under symmetric label noise and 0.12% to 38.59% under mixed label noise. Moreover, TSHN exhibits greater robustness to varying label noise rates than existing methods.
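The divide-and-conquer step described in the abstract — a teacher trained on trusted samples re-evaluating untrusted labels by confidence — can be illustrated with a minimal sketch. This is not the authors' TSHN implementation; the function name, the agreement/confidence rule, and the threshold `tau` are illustrative assumptions, and the meta-learning and distillation components are omitted.

```python
import numpy as np

def partition_by_teacher_confidence(teacher_probs, noisy_labels, tau=0.8):
    """Hypothetical divide-and-conquer over untrusted labels.

    teacher_probs: (N, C) softmax outputs of a teacher trained on the
                   few-shot trusted set.
    noisy_labels:  (N,) labels of the untrusted set.
    Returns corrected labels plus boolean masks for samples treated as
    trusted (teacher agrees or is confident) and as uncertain.
    """
    pred = teacher_probs.argmax(axis=1)   # teacher's predicted class
    conf = teacher_probs.max(axis=1)      # teacher's confidence
    agree = pred == noisy_labels
    confident = conf >= tau
    # Keep agreeing labels; relabel confident disagreements; flag the rest
    # as uncertain (e.g., for down-weighting in student training).
    corrected = np.where(confident, pred, noisy_labels)
    trusted = agree | confident
    return corrected, trusted, ~trusted

# Toy example: 3 samples, 2 classes; sample 1 is a confident disagreement.
probs = np.array([[0.9, 0.1],
                  [0.2, 0.8],
                  [0.55, 0.45]])
labels = np.array([0, 0, 1])
corrected, trusted, uncertain = partition_by_teacher_confidence(probs, labels)
```

In this toy run the second sample's label is flipped to the teacher's confident prediction, while the low-confidence third sample is routed to the uncertain partition rather than relabeled.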
Pages: 402-418 (17 pages)