Using Feature Entropy to Guide Filter Pruning for Efficient Convolutional Networks

Cited by: 9
Authors
Li, Yun [1 ]
Wang, Luyang [1 ]
Peng, Sifan [1 ]
Kumar, Aakash [1 ]
Yin, Baoqun [1 ]
Affiliations
[1] Univ Sci & Technol China, Dept Automat, Hefei, Peoples R China
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: DEEP LEARNING, PT II | 2019 / Vol. 11728
Keywords
Convolutional neural networks; Filter pruning; Entropy; Features selection module;
DOI
10.1007/978-3-030-30484-3_22
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The rapid development of convolutional neural networks (CNNs) is usually accompanied by increases in model size and computational cost. In this paper, we propose an entropy-based filter pruning (EFP) method to learn more efficient CNNs. Unlike many existing filter pruning approaches, our method prunes unimportant filters based on the amount of information carried by their corresponding feature maps. We employ entropy to measure the information contained in the feature maps and design a feature selection module to formulate pruning strategies. Pruning and fine-tuning are iterated several times, yielding thinner and more compact models with comparable accuracy. We empirically demonstrate the effectiveness of our method with several advanced CNNs on benchmark datasets. Notably, for VGG-16 on CIFAR-10, EFP prunes 92.9% of parameters and reduces floating-point operations (FLOPs) by 76% without accuracy loss, advancing the state of the art.
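The entropy criterion described in the abstract can be sketched in a few lines: estimate each channel's activation distribution with a histogram, compute its Shannon entropy, and mark the lowest-entropy (least informative) filters for pruning. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation; the function names, the histogram bin count, and the prune ratio are hypothetical choices.

```python
import numpy as np

def channel_entropy(feature_maps, bins=32):
    """Shannon entropy (bits) of each channel's activation distribution.

    feature_maps: array of shape (N, C, H, W) collected from one conv layer.
    Returns an array of C entropy values, one per filter/channel.
    """
    _, c, _, _ = feature_maps.shape
    entropies = np.empty(c)
    for ch in range(c):
        vals = feature_maps[:, ch].ravel()
        hist, _ = np.histogram(vals, bins=bins)      # empirical distribution
        p = hist / hist.sum()
        p = p[p > 0]                                 # avoid log(0)
        entropies[ch] = -np.sum(p * np.log2(p))
    return entropies

def select_filters_to_prune(entropies, prune_ratio=0.5):
    """Indices of the lowest-entropy filters, i.e. the pruning candidates."""
    k = int(len(entropies) * prune_ratio)
    return np.argsort(entropies)[:k]
```

A channel whose output is nearly constant carries little information: its histogram collapses into one bin, its entropy approaches zero, and it ranks first for removal. In the paper's pipeline this selection step would alternate with fine-tuning over several iterations.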
Pages: 263 - 274
Page count: 12
Related Papers
50 records total
  • [21] An optimal-score-based filter pruning for deep convolutional neural networks
    Shrutika S. Sawant
    J. Bauer
    F. X. Erick
    Subodh Ingaleshwar
    N. Holzer
    A. Ramming
    E. W. Lang
    Th. Götz
    Applied Intelligence, 2022, 52 : 17557 - 17579
  • [22] Filter pruning via annealing decaying for deep convolutional neural networks acceleration
    Huang, Jiawen
    Xiong, Liyan
    Huang, Xiaohui
    Chen, Qingsen
    Huang, Peng
    CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS, 2025, 28 (02):
  • [23] A Dual Rank-Constrained Filter Pruning Approach for Convolutional Neural Networks
    Fan, Fugui
    Su, Yuting
    Jing, Peiguang
    Lu, Wei
    IEEE SIGNAL PROCESSING LETTERS, 2021, 28 : 1734 - 1738
  • [24] Hardware-Aware Evolutionary Explainable Filter Pruning for Convolutional Neural Networks
    Heidorn, Christian
    Sabih, Muhammad
    Meyerhoefer, Nicolai
    Schinabeck, Christian
    Teich, Juergen
    Hannig, Frank
    INTERNATIONAL JOURNAL OF PARALLEL PROGRAMMING, 2024, 52 (1-2) : 40 - 58
  • [25] Zero-Keep Filter Pruning for Energy/Power Efficient Deep Neural Networks
    Woo, Yunhee
    Kim, Dongyoung
    Jeong, Jaemin
    Ko, Young-Woong
    Lee, Jeong-Gun
    ELECTRONICS, 2021, 10 (11)
  • [26] FP-AGL: Filter Pruning With Adaptive Gradient Learning for Accelerating Deep Convolutional Neural Networks
    Kim, Nam Joon
    Kim, Hyun
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 5279 - 5290
  • [27] Batch Entropy Supervised Convolutional Neural Networks for Feature Extraction and Harmonizing for Action Recognition
    Hossain, Md Imtiaz
    Siddique, Ashraf
    Hossain, Md Alamgir
    Hossain, Md Delowar
    Huh, Eui-Nam
    IEEE ACCESS, 2020, 8 : 206427 - 206444
  • [28] Filter Pruning Without Damaging Networks Capacity
    Zuo, Yuding
    Chen, Bo
    Shi, Te
    Sun, Mengfan
    IEEE ACCESS, 2020, 8 : 90924 - 90930
  • [29] CAPTOR: A Class Adaptive Filter Pruning Framework for Convolutional Neural Networks in Mobile Applications
    Qin, Zhuwei
    Yu, Fuxun
    Liu, Chenchen
    Chen, Xiang
    24TH ASIA AND SOUTH PACIFIC DESIGN AUTOMATION CONFERENCE (ASP-DAC 2019), 2019, : 444 - 449
  • [30] Filter Pruning Using Expectation Value of Feature Map's Summation
    Wu, Hai
    Liu, Chuanbin
    Lin, Fanchao
    Liu, Yizhi
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2021, PT IV, 2021, 13016 : 748 - 755