Zero-Shot Modulation Recognition via Knowledge-Informed Waveform Description

Times Cited: 0
Authors
Chen, Ying [1]
Wang, Xiang [1]
Huang, Zhitao [1,2]
Affiliations
[1] National University of Defense Technology, State Key Laboratory of Complex Electromagnetic Environment Effects on Electronics and Information System, Changsha 410000, China
[2] National University of Defense Technology, College of Electronic Engineering, Hefei 230000, China
Funding
National Natural Science Foundation of China;
Keywords
Semantics; Modulation; Training; Symmetric matrices; Vectors; Zero-shot learning; Symbols; Visualization; Receivers; Laboratories; Automatic modulation recognition; knowledge and data joint-driven learning; zero-shot learning; graph neural networks; NETWORK;
DOI
10.1109/LSP.2024.3491013
CLC Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808; 0809;
Abstract
In non-cooperative environments, deep learning-based automatic modulation recognition techniques often struggle when training data are insufficient or entirely unavailable. In this letter, we investigate this problem for the amplitude-phase-modulation recognition task and introduce a knowledge-informed waveform description for zero-shot recognition generalization. Specifically, drawing on constellation association knowledge, we define a constellation-based semantic attribute set to describe waveform structures and employ a graph formulation to model the symmetric dependencies among attributes, improving their representations. We then align the waveform and semantic spaces by associating waveform representations with compositional attribute representations, facilitating the transfer of knowledge from the seen domain to the unseen domain. Guided by the output attribute descriptions, our scheme can infer the labels of unseen waveform types, rather than merely flagging test instances as unseen. Experiments validate the efficacy of the proposed method on both few-shot and zero-shot recognition tasks.
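The letter itself provides no code; the following is a minimal sketch, assuming a PyTorch setting, of the attribute-based zero-shot inference the abstract describes: an encoder projects I/Q waveform segments into a shared semantic space, and class labels (including unseen ones) are scored by compatibility with per-class attribute signatures. All names (WaveformEncoder, ATTRS, zero_shot_predict) and the attribute values are hypothetical illustrations, and the letter's graph neural network for modeling symmetric attribute dependencies is deliberately omitted here.

```python
import torch
import torch.nn as nn

# Hypothetical attribute signatures: each modulation class is described by a
# vector of constellation-derived semantic attributes. Rows are classes
# (seen and unseen), columns are attributes; the values are illustrative only.
ATTRS = torch.tensor([
    [1., 0., 1., 0.],   # e.g., BPSK  (seen during training)
    [1., 1., 0., 0.],   # e.g., QPSK  (seen during training)
    [0., 1., 1., 1.],   # e.g., 16QAM (unseen; signature supplied by knowledge)
])

class WaveformEncoder(nn.Module):
    """Maps an I/Q waveform segment into the shared semantic (attribute) space."""
    def __init__(self, in_ch: int = 2, num_attrs: int = 4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_ch, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # pool over time
            nn.Flatten(),
            nn.Linear(32, num_attrs),  # predicted attribute scores
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def zero_shot_predict(encoder: nn.Module, x: torch.Tensor,
                      attr_matrix: torch.Tensor) -> torch.Tensor:
    # Score each class by compatibility between the waveform embedding and the
    # class's attribute signature. Unseen classes are reachable because their
    # signatures recombine attributes learned on seen classes.
    z = encoder(x)                   # (batch, num_attrs)
    scores = z @ attr_matrix.t()     # (batch, num_classes)
    return scores.argmax(dim=1)

if __name__ == "__main__":
    encoder = WaveformEncoder()
    iq_batch = torch.randn(8, 2, 128)   # 8 segments, I/Q channels, 128 samples
    print(zero_shot_predict(encoder, iq_batch, ATTRS))
```

In this formulation, adding a new modulation type at test time only requires writing down its attribute signature; no retraining is needed, which is the essence of the zero-shot transfer the letter targets.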
Pages: 21-25
Page count: 5