A One-step Pruning-recovery Framework for Acceleration of Convolutional Neural Networks

Cited by: 0
Authors
Wang, Dong [1 ]
Bai, Xiao [1 ]
Zhou, Lei [1 ]
Zhou, Jun [2 ]
Affiliations
[1] Beihang Univ, Sch Comp Sci & Engn, Beijing Adv Innovat Ctr Big Data & Brain Comp, Jiangxi Res Inst, Beijing, Peoples R China
[2] Griffith Univ, Sch Informat & Commun Technol, Nathan, Qld, Australia
Source
2019 IEEE 31ST INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2019) | 2019
Funding
National Natural Science Foundation of China;
Keywords
filter pruning; network pruning; cnn acceleration;
DOI
10.1109/ICTAI.2019.00111
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Acceleration of convolutional neural networks has received increasing attention during the past several years. Among various acceleration techniques, filter pruning has the inherent merit of effectively reducing the number of convolution filters. However, most filter pruning methods resort to a tedious and time-consuming layer-by-layer pruning-recovery strategy to avoid a significant drop in accuracy. In this paper, we present an efficient filter pruning framework to solve this problem. Our method accelerates the network in a one-step pruning-recovery manner with a novel optimization objective function, achieving higher accuracy at much lower cost than existing pruning methods. Furthermore, our method allows network compression with global filter pruning. Given a global pruning rate, it can adaptively determine the pruning rate for each convolutional layer, whereas these rates are often set as hyper-parameters in previous approaches. Evaluated on VGG-16 and ResNet-50 using ImageNet, our approach outperforms several state-of-the-art methods with a smaller accuracy drop under the same or even much fewer floating-point operations (FLOPs).
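The abstract describes global filter pruning: given a single global pruning rate, the method adaptively determines a pruning rate for each convolutional layer. The paper's own criterion is its novel optimization objective, which is not reproduced here; the Python/PyTorch sketch below only illustrates the general idea under the assumption of a simple L1-norm importance score, and names such as global_filter_ranking and global_prune_rate are illustrative, not from the paper.

# Minimal sketch of global filter pruning with a single global rate.
# Importance is plain L1 norm here (an assumption, not the paper's
# optimization objective).
import torch.nn as nn
import torchvision.models as models

def global_filter_ranking(model, global_prune_rate=0.5):
    # Score every conv filter network-wide by the L1 norm of its
    # weights; Conv2d weight shape is (out_channels, in_channels, kH, kW).
    scores = []
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d):
            norms = module.weight.detach().abs().sum(dim=(1, 2, 3))
            scores += [(name, i, n.item()) for i, n in enumerate(norms)]

    # Sort all filters in one global pool and mark the weakest fraction.
    scores.sort(key=lambda t: t[2])
    pruned = scores[:int(len(scores) * global_prune_rate)]

    # The per-layer pruning rates fall out of the global ranking
    # instead of being set by hand for each layer.
    pruned_per_layer = {}
    for name, _, _ in pruned:
        pruned_per_layer[name] = pruned_per_layer.get(name, 0) + 1
    totals = {name: m.out_channels for name, m in model.named_modules()
              if isinstance(m, nn.Conv2d)}
    return {name: pruned_per_layer.get(name, 0) / totals[name]
            for name in totals}

if __name__ == "__main__":
    vgg16 = models.vgg16()  # same backbone as in the paper's experiments
    for layer, rate in global_filter_ranking(vgg16, 0.5).items():
        print(f"{layer}: prune {rate:.1%} of filters")

Because all filters compete in one pool, layers whose filters score uniformly low end up pruned more aggressively than layers with many strong filters, which is how a single global rate yields different per-layer rates.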
Pages: 768-775
Number of pages: 8