Progressive multi-level distillation learning for pruning network

Cited by: 0
Authors
Ruiqing Wang
Shengmin Wan
Wu Zhang
Chenlu Zhang
Yu Li
Shaoxiang Xu
Lifu Zhang
Xiu Jin
Zhaohui Jiang
Yuan Rao
Affiliations
[1] Anhui Agricultural University,School of Information and Computer
[2] Anhui Agricultural University,Anhui Province Key Laboratory of Smart Agricultural Technology and Equipment
Keywords
Deep neural network; Model compression; Network pruning; Knowledge distillation;
DOI: not available
Abstract
Although classification methods based on deep neural networks achieve excellent results, their high memory footprint and prohibitive inference time make them difficult to deploy in real-time scenarios. Compared with unstructured pruning, structured pruning reduces the runtime computation cost of a model more effectively, but it inevitably reduces the model's accuracy. Traditional methods use fine-tuning to restore the performance lost to pruning, yet a large gap remains between the pruned model and the original one. In this paper, we use progressive multi-level distillation learning to compensate for the loss caused by pruning: the pre-pruning and post-pruning networks serve as the teacher and student networks, respectively. The proposed approach exploits the complementary properties of structured pruning and knowledge distillation, allowing the pruned network to learn both the intermediate and the output representations of the teacher network and thus mitigating the damage caused by pruning. Experiments demonstrate that our approach performs better on the CIFAR-10, CIFAR-100, and Tiny-ImageNet datasets at different pruning rates; for instance, GoogLeNet achieves near-lossless pruning on CIFAR-10 at a 60% pruning rate. Moreover, we also show that applying the proposed distillation method during the pruning process yields larger performance gains than applying it only after pruning is complete.
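The abstract describes the method only at a high level. As a rough illustration of how such a multi-level distillation loss is commonly assembled, the sketch below trains the pruned student against the frozen pre-pruning teacher with a task loss, an output-level term, and an intermediate-level term. It assumes PyTorch; the function name, the temperature T, and the weights alpha and beta are illustrative choices, not values from the paper, and the selected student and teacher feature maps are assumed to have matching shapes (e.g. after FitNets-style 1×1 convolutional projections).

```python
import torch
import torch.nn.functional as F

def multi_level_distillation_loss(student_logits, teacher_logits,
                                  student_feats, teacher_feats,
                                  labels, T=4.0, alpha=0.5, beta=1.0):
    # Hypothetical helper: combines the task loss with output-level and
    # intermediate-level distillation terms. T, alpha, and beta are
    # assumed hyperparameters, not values reported in the paper.

    # Standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)

    # Output-level distillation: KL divergence between softened
    # student and teacher distributions, scaled by T^2.
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * (T * T)

    # Intermediate-level distillation: match feature maps at several
    # depths; shapes are assumed to agree (e.g. via 1x1 projections).
    feat = sum(F.mse_loss(s, t) for s, t in zip(student_feats, teacher_feats))

    return ce + alpha * kd + beta * feat

# Minimal usage example with random tensors standing in for a batch.
if __name__ == "__main__":
    B, C = 8, 10
    logits_s, logits_t = torch.randn(B, C), torch.randn(B, C)
    feats_s = [torch.randn(B, 64, 8, 8)]
    feats_t = [torch.randn(B, 64, 8, 8)]
    labels = torch.randint(0, C, (B,))
    loss = multi_level_distillation_loss(logits_s, logits_t,
                                         feats_s, feats_t, labels)
    print(loss.item())
```

In the progressive setting the abstract describes, a loss of this form would presumably be applied at each pruning step, with the current pruned network as the student, rather than once after pruning finishes.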
Pages: 5779–5791 (12 pages)