Adaptive lightweight network construction method for Self-Knowledge Distillation

Cited by: 0
Authors
Lu, Siyuan [1 ]
Zeng, Weiliang [1 ]
Li, Xueshi [1 ]
Ou, Jiajun [1 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou 510006, Guangdong, Peoples R China
Keywords
Deep learning; Knowledge distillation; Neural network architecture design
DOI
10.1016/j.neucom.2025.129477
CLC number
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Self-Knowledge Distillation (self-KD) has become a promising approach to neural network compression owing to its computational efficiency. Nevertheless, its applicability is constrained by the inherent inflexibility of the network architecture and by the absence of quantitative metrics for evaluating how distillable an architecture is. To address these problems, a two-stage adaptive dynamic distillation network (ADDN) framework is proposed that adapts the architecture according to its distillability; it comprises a hypernetwork topology construction stage and a subnetwork training stage. To evaluate the distillability of candidate architectures without extensive training, we propose a set of low-cost distillability metrics that assess architectures in terms of architectural similarity and clustering ability. Furthermore, to simplify the hypernetwork structure and reduce the cost of the construction process, a hierarchical filtering module is introduced that incrementally refines and prunes candidate operations within the architecture according to its current distillability. To validate the effectiveness of the approach, we conduct extensive experiments on several image classification datasets and compare it with current methods. Experimental results demonstrate that the self-knowledge distillation network architecture obtained by the proposed method attains superior distillability and efficiency while significantly reducing construction cost.
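The record does not define the paper's actual distillability metrics. As a rough illustration only, the Python sketch below computes one plausible clustering-ability proxy, a Fisher-style ratio of between-class to within-class feature scatter over a few mini-batches, which matches the abstract's idea of scoring an architecture's features without full training. All names here (clustering_score, features, labels) are hypothetical and are not taken from the paper.

    import numpy as np

    def clustering_score(features: np.ndarray, labels: np.ndarray) -> float:
        """Fisher-style clustering proxy: between-class scatter divided by
        within-class scatter of intermediate features.

        features: (N, D) activations collected from a lightly trained
                  candidate subnetwork on a few mini-batches.
        labels:   (N,) integer class labels for those samples.
        A higher ratio means the features already separate classes well,
        treated here as a cheap proxy for distillability.
        """
        overall_mean = features.mean(axis=0)
        between, within = 0.0, 0.0
        for c in np.unique(labels):
            class_feats = features[labels == c]           # samples of class c
            mu_c = class_feats.mean(axis=0)               # class centroid
            between += len(class_feats) * np.sum((mu_c - overall_mean) ** 2)
            within += np.sum((class_feats - mu_c) ** 2)
        return float(between / (within + 1e-12))

    # Toy usage: 200 samples, 64-dim features, 10 classes.
    rng = np.random.default_rng(0)
    labels = rng.integers(0, 10, size=200)
    features = rng.normal(size=(200, 64)) + labels[:, None]  # class-shifted
    print(clustering_score(features, labels))

In the paper's setting, such a score would presumably be combined with an architectural-similarity term to rank candidate operations in the hierarchical filtering module; that combination is not specified in this record.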
Pages: 14
Related Papers
50 records in total
  • [21] A Multi-Scale Convolutional Neural Network with Self-Knowledge Distillation for Bearing Fault Diagnosis. Yu, Jiamao; Hu, Hexuan. MACHINES, 2024, 12(11)
  • [22] Robust and Accurate Object Detection via Self-Knowledge Distillation. Xu, Weipeng; Chu, Pengzhi; Xie, Renhao; Xiao, Xiongziyan; Huang, Hongcheng. 2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2022: 91-95
  • [23] Self-Knowledge Distillation via Feature Enhancement for Speaker Verification. Liu, Bei; Wang, Haoyu; Chen, Zhengyang; Wang, Shuai; Qian, Yanmin. 2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022: 7542-7546
  • [24] MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition. Yang, Chuanguang; An, Zhulin; Zhou, Helong; Cai, Linhang; Zhi, Xiang; Wu, Jiwen; Xu, Yongjun; Zhang, Qian. COMPUTER VISION, ECCV 2022, PT XXIV, 2022, 13684: 534-551
  • [25] Personalized Edge Intelligence via Federated Self-Knowledge Distillation. Jin, Hai; Bai, Dongshan; Yao, Dezhong; Dai, Yutong; Gu, Lin; Yu, Chen; Sun, Lichao. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2023, 34(02): 567-580
  • [26] Self-Knowledge Distillation for First Trimester Ultrasound Saliency Prediction. Gridach, Mourad; Savochkina, Elizaveta; Drukker, Lior; Papageorghiou, Aris T.; Noble, J. Alison. SIMPLIFYING MEDICAL ULTRASOUND, ASMUS 2022, 2022, 13565: 117-127
  • [27] Automatic Diabetic Retinopathy Grading via Self-Knowledge Distillation. Luo, Ling; Xue, Dingyu; Feng, Xinglong. ELECTRONICS, 2020, 9(09): 1-13
  • [28] Decoupled Feature and Self-Knowledge Distillation for Speech Emotion Recognition. Yu, Haixiang; Ning, Yuan. IEEE ACCESS, 2025, 13: 33275-33285
  • [29] Teaching Yourself: A Self-Knowledge Distillation Approach to Action Recognition. Duc-Quang Vu; Le, Ngan; Wang, Jia-Ching. IEEE ACCESS, 2021, 9: 105711-105723
  • [30] Two-Stage Approach for Targeted Knowledge Transfer in Self-Knowledge Distillation. Yin, Zimo; Pu, Jian; Zhou, Yijie; Xue, Xiangyang. IEEE-CAA JOURNAL OF AUTOMATICA SINICA, 2024, 11(11): 2270-2283