Collaborative Knowledge Distillation

Cited by: 1
Authors
Zhang, Weiwei [1 ]
Guo, Yufeng [1 ]
Wang, Junhuang [1 ]
Zhu, Jianqing [1 ]
Zeng, Huanqiang [1 ]
Affiliations
[1] Huaqiao Univ, Coll Engn, Quanzhou 362021, Peoples R China
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
Knowledge engineering; Training; Feature extraction; Uncertainty; Correlation; Collaboration; Circuits and systems; Knowledge distillation; teacher-student learning; deep learning;
DOI
10.1109/TCSVT.2024.3377251
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Existing research on knowledge distillation has primarily concentrated on helping student networks acquire the complete knowledge imparted by teacher networks. However, recent studies have shown that more accurate networks are not necessarily better teachers, and that distillation performance is positively correlated with the uncertainty of the teacher's predictions. Motivated by this finding, this paper analyzes why the teacher network affects distillation performance, involves the student network more actively in the distillation process, and assists the teacher network in distilling knowledge suited to the student's learning. On this basis, a novel approach called Collaborative Knowledge Distillation (CKD) is introduced, founded on the idea of "tailoring the teaching to the individual". On CIFAR-100, the proposed method improves student accuracy by an average of 3.42% over the baseline and by an average of 1.71% over the classical Knowledge Distillation (KD) method. On ImageNet, it improves the students' Top-1 accuracy by 2.04%.
Pages: 7601-7613
Number of pages: 13
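
For context on the comparisons reported in the abstract, the classical Knowledge Distillation (KD) baseline trains the student on a weighted combination of the hard-label cross-entropy and a temperature-softened KL-divergence term against the teacher's predictions. The following is a minimal sketch of that baseline objective, assuming PyTorch; the function name and the hyperparameters T and alpha are illustrative assumptions, and it does not implement the paper's CKD method.

# Minimal sketch (assumption: PyTorch) of the classical KD loss referenced
# as a baseline in the abstract; NOT the paper's Collaborative Knowledge
# Distillation (CKD) method.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Hard-label term: standard cross-entropy on the student's own logits.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label term: KL divergence between temperature-softened teacher
    # and student distributions, scaled by T^2 to keep gradient magnitudes
    # comparable across temperatures.
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Weighted combination of the soft-label and hard-label terms.
    return alpha * kl + (1.0 - alpha) * ce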