Contrastive Prototype-Guided Generation for Generalized Zero-Shot Learning

Cited by: 3
Authors
Wang, Yunyun [1]
Mao, Jian [1]
Guo, Chenguang [1]
Chen, Songcan [2]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Nanjing, Peoples R China
[2] Nanjing Univ Aeronaut & Astronaut, Nanjing, Peoples R China
Keywords
Zero-shot learning; Generative adversarial network; Contrastive prototype; Feature diversity
DOI
10.1016/j.neunet.2024.106324
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Generalized zero-shot learning (GZSL) aims to recognize both seen and unseen classes, while only samples from seen classes are available for training. Mainstream methods mitigate the lack of unseen training data by synthesizing visual features for unseen classes. However, the sample generator is learned from seen-class samples only, and the semantic descriptions of unseen classes are merely fed to this pre-trained generator to produce unseen data; the generator is therefore biased toward seen categories, and the quality of unseen generation, in both precision and diversity, remains the main learning challenge. To this end, we propose Prototype-Guided Generation for Generalized Zero-Shot Learning (PGZSL), which guides sample generation with unseen-class knowledge. First, unseen data generation in PGZSL is guided and rectified by contrastive prototypical anchors that enforce both class semantic consistency and feature discriminability. Second, PGZSL introduces Certainty-Driven Mixup in the generator to enrich the diversity of generated unseen samples while suppressing the generation of uncertain boundary samples. Empirical results on five benchmark datasets show that PGZSL significantly outperforms state-of-the-art methods on both ZSL and GZSL tasks.
Pages: 8
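For readers who want a concrete picture of the two components named in the abstract, below is a minimal PyTorch sketch, not the authors' implementation: a prototype-anchored contrastive loss (standing in for the contrastive prototypical anchors) and a certainty-weighted mixup loss (a rough stand-in for Certainty-Driven Mixup). All function names, tensor shapes, and the specific certainty weighting are assumptions made for illustration only.

```python
# Illustrative sketch only (not the paper's released code): a prototype-anchored
# contrastive loss and a certainty-weighted mixup loss, assuming PyTorch and
# hypothetical tensor shapes. Prototype construction, the GAN backbone, and all
# hyper-parameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def prototype_contrastive_loss(features, labels, prototypes, temperature=0.1):
    """InfoNCE-style loss that pulls each (generated) feature toward its class
    prototype anchor and pushes it away from the other class prototypes."""
    features = F.normalize(features, dim=1)            # (B, D)
    prototypes = F.normalize(prototypes, dim=1)        # (C, D)
    logits = features @ prototypes.t() / temperature   # (B, C) scaled cosine sims
    return F.cross_entropy(logits, labels)


def certainty_weighted_mixup_loss(features, labels, num_classes, classifier, alpha=0.2):
    """Mix pairs of features and weight each mixed sample by the classifier's
    certainty (max softmax probability), so that ambiguous boundary mixtures
    contribute less to the loss."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(features.size(0))
    mixed = lam * features + (1.0 - lam) * features[perm]            # (B, D)
    targets = lam * F.one_hot(labels, num_classes).float() \
              + (1.0 - lam) * F.one_hot(labels[perm], num_classes).float()
    logits = classifier(mixed)                                       # (B, C)
    certainty = logits.softmax(dim=1).max(dim=1).values.detach()     # (B,)
    per_sample = -(targets * logits.log_softmax(dim=1)).sum(dim=1)   # soft cross-entropy
    return (certainty * per_sample).mean()


if __name__ == "__main__":
    B, D, C = 32, 2048, 50                      # batch, feature dim, classes (made-up)
    feats = torch.randn(B, D)                   # e.g. generator outputs
    labels = torch.randint(0, C, (B,))
    protos = torch.randn(C, D)                  # e.g. class prototypes from semantics
    clf = nn.Linear(D, C)
    loss = prototype_contrastive_loss(feats, labels, protos) \
           + certainty_weighted_mixup_loss(feats, labels, C, clf)
    loss.backward()
```

In this sketch the two losses would be added to the generator's usual adversarial objective; how the prototypes are built from class semantics and how certainty is actually defined in PGZSL are not specified here and follow the paper itself.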