Using Feature Entropy to Guide Filter Pruning for Efficient Convolutional Networks

Cited by: 9
Authors
Li, Yun [1 ]
Wang, Luyang [1 ]
Peng, Sifan [1 ]
Kumar, Aakash [1 ]
Yin, Baoqun [1 ]
Affiliations
[1] Univ Sci & Technol China, Dept Automat, Hefei, Peoples R China
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: DEEP LEARNING, PT II | 2019 / Vol. 11728
Keywords
Convolutional neural networks; Filter pruning; Entropy; Feature selection module
DOI
10.1007/978-3-030-30484-3_22
CLC number
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
The rapid development of convolutional neural networks (CNNs) is usually accompanied by an increase in model size and computational cost. In this paper, we propose an entropy-based filter pruning (EFP) method to learn more efficient CNNs. Unlike many existing filter pruning approaches, our method prunes unimportant filters based on the amount of information carried by their corresponding feature maps. We employ entropy to measure the information contained in the feature maps and design a feature selection module to formulate pruning strategies. Pruning and fine-tuning are iterated several times, yielding thinner and more compact models with comparable accuracy. We empirically demonstrate the effectiveness of our method with several advanced CNNs on benchmark datasets. Notably, for VGG-16 on CIFAR-10, EFP prunes 92.9% of the parameters and reduces floating-point operations (FLOPs) by 76% without accuracy loss, advancing the state of the art.
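The entropy criterion described in the abstract can be illustrated with a small sketch. This is not the paper's exact formulation (the authors' feature selection module is more involved); it is a minimal, hypothetical version of the idea: histogram each filter's activations over a batch of inputs, compute the Shannon entropy of that distribution, and mark the lowest-entropy filters as pruning candidates. The function names `filter_entropy` and `rank_filters` and the histogram binning are assumptions for illustration.

```python
import numpy as np

def filter_entropy(feature_map, bins=32):
    """Shannon entropy (bits) of one filter's activation distribution.

    feature_map: activations of a single filter over a batch of inputs,
    e.g. shape (N, H, W). Hypothetical interface, not the paper's API.
    """
    hist, _ = np.histogram(feature_map, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

def rank_filters(layer_output, prune_ratio=0.5, bins=32):
    """Indices of filters to prune, lowest-entropy first.

    layer_output: (N, C, H, W) activations of one conv layer; a filter
    whose feature maps carry little information gets low entropy.
    """
    entropies = np.array([filter_entropy(layer_output[:, c], bins)
                          for c in range(layer_output.shape[1])])
    n_prune = int(round(prune_ratio * len(entropies)))
    return np.argsort(entropies)[:n_prune]

# Toy demo: 8 inputs, 16 filters, 4x4 maps; filter 0 is "dead"
# (constant output), so its entropy is zero and it ranks lowest.
rng = np.random.default_rng(0)
acts = rng.standard_normal((8, 16, 4, 4))
acts[:, 0] = 0.0
pruned = rank_filters(acts, prune_ratio=0.25)
print(0 in pruned)
```

In the actual EFP pipeline, such a ranking step would be followed by removing the selected filters and fine-tuning, iterated several times as the abstract describes.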
Pages: 263-274
Page count: 12
Related Papers
50 records
  • [1] Filter Pruning for Efficient Transfer Learning in Deep Convolutional Neural Networks
    Reinhold, Caique
    Roisenberg, Mauro
    ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, PT I, 2019, 11508 : 191 - 202
  • [2] Filter pruning by quantifying feature similarity and entropy of feature maps
    Liu, Yajun
    Fan, Kefeng
    Wu, Dakui
    Zhou, Wenju
    NEUROCOMPUTING, 2023, 544
  • [3] Filter pruning with a feature map entropy importance criterion for convolution neural networks compressing
    Wang, Jielei
    Jiang, Ting
    Cui, Zongyong
    Cao, Zongjie
    NEUROCOMPUTING, 2021, 461 : 41 - 54
  • [4] Acceleration of Deep Convolutional Neural Networks Using Adaptive Filter Pruning
    Singh, Pravendra
    Verma, Vinay Kumar
    Rai, Piyush
    Namboodiri, Vinay P.
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2020, 14 (04) : 838 - 847
  • [5] FPC: Filter pruning via the contribution of output feature map for deep convolutional neural networks acceleration
    Chen, Yanming
    Wen, Xiang
    Zhang, Yiwen
    He, Qiang
    KNOWLEDGE-BASED SYSTEMS, 2022, 238
  • [6] Entropy-based pruning method for convolutional neural networks
    Hur, Cheonghwan
    Kang, Sanggil
    JOURNAL OF SUPERCOMPUTING, 2019, 75 (06): : 2950 - 2963
  • [7] FILTER PRUNING BASED ON LOCAL GRADIENT ACTIVATION MAPPING IN CONVOLUTIONAL NEURAL NETWORKS
    Intraraprasit, Monthon
    Chitsobhuk, Orachat
    INTERNATIONAL JOURNAL OF INNOVATIVE COMPUTING INFORMATION AND CONTROL, 2023, 19 (06): : 1697 - 1715
  • [8] Pruning convolutional neural networks via filter similarity analysis
    Geng, Lili
    Niu, Baoning
    MACHINE LEARNING, 2022, 111 : 3161 - 3180
  • [9] Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks
    He, Yang
    Dong, Xuanyi
    Kang, Guoliang
    Fu, Yanwei
    Yan, Chenggang
    Yang, Yi
    IEEE TRANSACTIONS ON CYBERNETICS, 2020, 50 (08) : 3594 - 3604