A Virtual Knowledge Distillation via Conditional GAN

Times Cited: 6
Authors
Kim, Sihwan [1 ]
Affiliations
[1] Hana Inst Technol, Big Data & AI Lab, Seoul 06133, South Korea
Source
IEEE ACCESS | 2022 / Vol. 10
Keywords
Training; Generators; Knowledge engineering; Bridges; Generative adversarial networks; Task analysis; Collaborative work; Image classification; model compression; knowledge distillation; self-knowledge distillation; collaborative learning; conditional generative adversarial network;
DOI
10.1109/ACCESS.2022.3163398
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Knowledge distillation aims to transfer knowledge from a pre-trained complex model, called the teacher, to a relatively smaller and faster one, called the student. Unlike previous works that transfer the teacher's softened distributions or feature spaces, in this paper we propose a novel approach, called Virtual Knowledge Distillation (VKD), that transfers a softened distribution produced by a virtual knowledge generator conditioned on the class label. The virtual knowledge generator is trained independently of, but concurrently with, the teacher to mimic the teacher's softened distributions. Afterwards, when training the student, the virtual knowledge generator can be used in place of the teacher's softened distributions or combined with existing distillation methods in a straightforward manner. Moreover, with slight modifications, VKD can be applied not only to self-knowledge distillation but also to collaborative learning. We compare our method with several representative distillation methods across various combinations of teacher and student architectures on image classification tasks. Experimental results demonstrate that VKD achieves competitive performance compared to conventional distillation methods and, when combined with them, improves performance by a substantial margin.
Pages: 34766 - 34778
Number of Pages: 13
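
The abstract above describes the core VKD procedure: a conditional generator is trained alongside the teacher to mimic its softened (temperature-scaled) output distributions, and the student is then distilled from the generator's outputs rather than from the teacher directly. The following is a minimal PyTorch-style sketch of that idea only; it is not the authors' implementation, and all module names, dimensions, temperature, and loss weighting below are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES, NOISE_DIM, T = 10, 64, 4.0  # assumed class count, noise size, temperature

class VirtualKnowledgeGenerator(nn.Module):
    """Maps (noise, class label) to logits intended to mimic the teacher's
    softened distribution for that class (hypothetical architecture)."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(NUM_CLASSES, NOISE_DIM)
        self.net = nn.Sequential(
            nn.Linear(2 * NOISE_DIM, 128), nn.ReLU(),
            nn.Linear(128, NUM_CLASSES),
        )

    def forward(self, z, labels):
        return self.net(torch.cat([z, self.embed(labels)], dim=1))

def generator_step(generator, teacher_logits, labels, opt_g):
    """Train the generator to match the teacher's softened distribution;
    the teacher itself is trained concurrently on the usual cross-entropy loss."""
    z = torch.randn(labels.size(0), NOISE_DIM)
    virtual_logits = generator(z, labels)
    loss = F.kl_div(F.log_softmax(virtual_logits / T, dim=1),
                    F.softmax(teacher_logits.detach() / T, dim=1),
                    reduction="batchmean") * T * T
    opt_g.zero_grad(); loss.backward(); opt_g.step()
    return loss.item()

def student_loss(student_logits, labels, generator, alpha=0.5):
    """Distill the student from the virtual knowledge generator instead of the teacher."""
    with torch.no_grad():
        z = torch.randn(labels.size(0), NOISE_DIM)
        virtual_logits = generator(z, labels)
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(virtual_logits / T, dim=1),
                  reduction="batchmean") * T * T
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

Because the student queries only the generator (conditioned on the ground-truth label), the same sketch extends naturally to the self-distillation and collaborative-learning variants mentioned in the abstract, where no separate pre-trained teacher is kept at student-training time.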
Related Papers
50 records in total
  • [21] Knowledge Distillation in Acoustic Scene Classification. Jung, Jee-Weon; Heo, Hee-Soo; Shim, Hye-Jin; Yu, Ha-Jin. IEEE ACCESS, 2020, 8: 166870 - 166879.
  • [22] Knowledge distillation via Noisy Feature Reconstruction. Shi, Chaokun; Hao, Yuexing; Li, Gongyan; Xu, Shaoyun. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 257.
  • [23] Stereo Confidence Estimation via Locally Adaptive Fusion and Knowledge Distillation. Kim, Sunok; Kim, Seungryong; Min, Dongbo; Frossard, Pascal; Sohn, Kwanghoon. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (05): 6372 - 6385.
  • [24] Nickel and Diming Your GAN: A Dual-Method Approach to Enhancing GAN Efficiency via Knowledge Distillation. Yeo, Sangyeop; Jang, Yoojin; Yoo, Jaejun. COMPUTER VISION - ECCV 2024, PT LXXXVIII, 2025, 15146: 104 - 121.
  • [25] Self-knowledge distillation via dropout. Lee, Hyoje; Park, Yeachan; Seo, Hyun; Kang, Myungjoo. COMPUTER VISION AND IMAGE UNDERSTANDING, 2023, 233.
  • [26] Conditional Activation GAN: Improved Auxiliary Classifier GAN. Cho, Jeongik; Yoon, Kyoungro. IEEE ACCESS, 2020, 8 (08): 216729 - 216740.
  • [27] Lightweight Intrusion Detection System with GAN-based Knowledge Distillation. Ali, Tarek; Eleyan, Amna; Bejaoui, Tarek; Al-Khalidi, Mohammed. 2024 INTERNATIONAL CONFERENCE ON SMART APPLICATIONS, COMMUNICATIONS AND NETWORKING, SMARTNETS-2024, 2024.
  • [28] GAN-Knowledge Distillation for One-Stage Object Detection. Wang, Wanwei; Hong, Wei; Wang, Feng; Yu, Jinke. IEEE ACCESS, 2020, 8: 60719 - 60727.
  • [29] Collaborative Knowledge Distillation. Zhang, Weiwei; Guo, Yufeng; Wang, Junhuang; Zhu, Jianqing; Zeng, Huanqiang. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (08): 7601 - 7613.
  • [30] Logitwise Distillation Network: Improving Knowledge Distillation via Introducing Sample Confidence. Shen, Teng; Cui, Zhenchao; Qi, Jing. APPLIED SCIENCES-BASEL, 2025, 15 (05).