Using Feature Entropy to Guide Filter Pruning for Efficient Convolutional Networks

Cited by: 9
Authors
Li, Yun [1 ]
Wang, Luyang [1 ]
Peng, Sifan [1 ]
Kumar, Aakash [1 ]
Yin, Baoqun [1 ]
Affiliations
[1] Univ Sci & Technol China, Dept Automat, Hefei, Peoples R China
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: DEEP LEARNING, PT II | 2019 / Vol. 11728
Keywords
Convolutional neural networks; Filter pruning; Entropy; Features selection module;
DOI
10.1007/978-3-030-30484-3_22
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The rapid development of convolutional neural networks (CNNs) is usually accompanied by an increase in model size and computational cost. In this paper, we propose an entropy-based filter pruning (EFP) method to learn more efficient CNNs. Unlike many existing filter pruning approaches, our method prunes unimportant filters based on the amount of information carried by their corresponding feature maps. We employ entropy to measure the information contained in the feature maps and design a feature selection module to formulate pruning strategies. Pruning and fine-tuning are iterated several times, yielding thinner and more compact models with comparable accuracy. We empirically demonstrate the effectiveness of our method with many advanced CNNs on several benchmark datasets. Notably, for VGG-16 on CIFAR-10, our EFP method prunes 92.9% of parameters and reduces floating-point operations (FLOPs) by 76% without accuracy loss, advancing the state of the art.
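The entropy criterion the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the global-average-pooling step, the histogram binning scheme, and the `prune_ratio` parameter are all assumptions introduced here for clarity.

```python
import numpy as np

def feature_map_entropy(feature_maps, num_bins=32):
    """Estimate the entropy of each filter's output feature map.

    feature_maps: array of shape (N, C, H, W) -- activations collected
    over N input images for a layer with C filters.
    Returns a length-C array of entropy scores (higher = more information).
    """
    n, c, h, w = feature_maps.shape
    # Reduce each map to one scalar per (input, filter) via global average pooling.
    pooled = feature_maps.mean(axis=(2, 3))  # shape (N, C)
    scores = np.empty(c)
    for j in range(c):
        # Histogram the N pooled responses of filter j and compute Shannon entropy.
        hist, _ = np.histogram(pooled[:, j], bins=num_bins)
        p = hist / hist.sum()
        p = p[p > 0]  # ignore empty bins (0 * log 0 = 0)
        scores[j] = -(p * np.log2(p)).sum()
    return scores

def select_filters_to_prune(scores, prune_ratio=0.5):
    """Return indices of the lowest-entropy filters, to be pruned."""
    k = int(len(scores) * prune_ratio)
    return np.argsort(scores)[:k]
```

A filter whose feature maps are nearly constant across inputs carries little information, gets a near-zero entropy score, and is pruned first; pruning and fine-tuning would then alternate for several rounds, as in the abstract.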
Pages: 263-274
Page count: 12