Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution

Cited by: 6
Authors
Yang, Chuanguang [1 ,2 ]
An, Zhulin [1 ]
Cai, Linhang [1 ,2 ]
Xu, Yongjun [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Comp Technol, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
Keywords
Task analysis; Knowledge engineering; Self-supervised learning; Feature extraction; Probability distribution; Training; Semantics; Knowledge distillation; representation learning; self-supervised learning; visual recognition
DOI
10.1109/TNNLS.2022.3186807
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Knowledge distillation (KD) is an effective framework that aims to transfer meaningful information from a large teacher to a smaller student. In general, KD involves deciding what knowledge to define and how to transfer it. Previous KD methods often focus on mining various forms of knowledge, for example, feature maps and refined information. However, such knowledge is derived from the primary supervised task and is thus highly task-specific. Motivated by the recent success of self-supervised representation learning, we propose an auxiliary self-supervision augmented task to guide networks to learn more meaningful features. From this task, we derive soft self-supervision augmented distributions as richer dark knowledge for KD. Unlike previous forms of knowledge, this distribution encodes joint knowledge from supervised and self-supervised feature learning. Beyond knowledge exploration, we propose to append several auxiliary branches at various hidden layers to take full advantage of hierarchical feature maps. Each auxiliary branch is guided to learn the self-supervision augmented task and to distill this distribution from teacher to student. Overall, we call our KD method hierarchical self-supervision augmented KD (HSSAKD). Experiments on standard image classification show that both offline and online HSSAKD achieve state-of-the-art performance in the field of KD. Transfer experiments on object detection further verify that HSSAKD can guide the network to learn better features. The code is available at https://github.com/winycg/HSAKD.
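To make the distilled quantity concrete, below is a minimal sketch of a self-supervision augmented distillation loss, assuming a 4-way rotation pretext task combined with K-way classification so that each auxiliary branch predicts over a K x 4 joint label space. The function and variable names are illustrative assumptions, not the authors' released code; the reference implementation is in the linked repository.

```python
# Hypothetical sketch: KL distillation over a joint (class x rotation) distribution.
import torch
import torch.nn.functional as F

def ss_augmented_kd_loss(student_logits, teacher_logits, T=3.0):
    """Soften teacher and student logits over the K*4-way joint label space
    with temperature T and match them via KL divergence."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

# Example with K=100 classes, 4 rotations, and a batch of 8 images
# (each image is fed under all 4 rotations, hence 8 * 4 rows).
K, R, B = 100, 4, 8
student_branch_logits = torch.randn(B * R, K * R)  # output of one auxiliary branch
teacher_branch_logits = torch.randn(B * R, K * R)
loss = ss_augmented_kd_loss(student_branch_logits, teacher_branch_logits)
```

In the full method, a term of this kind would be computed at every auxiliary branch attached to the hidden layers and combined with supervised learning of the joint task, which is what makes the scheme hierarchical.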
Pages: 2094-2108 (15 pages)
Related Papers
50 records in total
  • [1] Improving Audio Classification Method by Combining Self-Supervision with Knowledge Distillation
    Gong, Xuchao
    Duan, Hongjie
    Yang, Yaozhong
    Tan, Lizhuang
    Wang, Jian
    Vasilakos, Athanasios V.
    ELECTRONICS, 2024, 13 (01)
  • [2] Hierarchical Self-supervised Augmented Knowledge Distillation
    Yang, Chuanguang
    An, Zhulin
    Cai, Linhang
    Xu, Yongjun
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 1217 - 1223
  • [3] Boosting Self-Supervision for Single-View Scene Completion via Knowledge Distillation
    Han, Keonhee
    Muhle, Dominik
    Wimbauer, Felix
    Cremers, Daniel
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2024, : 9837 - 9847
  • [4] Self-distillation and self-supervision for partial label learning
    Yu, Xiaotong
    Sun, Shiding
    Tian, Yingjie
    PATTERN RECOGNITION, 2024, 146
  • [5] Fair Visual Recognition in Limited Data Regime using Self-Supervision and Self-Distillation
    Mazumder, Pratik
    Singh, Pravendra
    Namboodiri, Vinay P.
    2022 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2022), 2022, : 3889 - 3897
  • [6] THE FEASIBILITY OF SELF-SUPERVISION
    Hudelson, Earl
    JOURNAL OF EDUCATIONAL RESEARCH, 1952, 45 (05): : 335 - 347
  • [7] Unsupervised Discovery of the Long-Tail in Instance Segmentation Using Hierarchical Self-Supervision
    Weng, Zhenzhen
    Ogut, Mehmet Giray
    Limonchik, Shai
    Yeung, Serena
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 2603 - 2612
  • [8] Complementary Calibration: Boosting General Continual Learning With Collaborative Distillation and Self-Supervision
    Ji, Zhong
    Li, Jin
    Wang, Qiang
    Zhang, Zhongfei
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2023, 32 : 657 - 667
  • [9] RS-SSKD: Self-Supervision Equipped with Knowledge Distillation for Few-Shot Remote Sensing Scene Classification
    Zhang, Pei
    Li, Ying
    Wang, Dong
    Wang, Jiyue
    SENSORS, 2021, 21 (05) : 1 - 23
  • [10] Self-supervision, surveillance and transgression
    Simon, Gail
    JOURNAL OF FAMILY THERAPY, 2010, 32 (03) : 308 - 325