A One-step Pruning-recovery Framework for Acceleration of Convolutional Neural Networks

Cited by: 0
Authors
Wang, Dong [1 ]
Bai, Xiao [1 ]
Zhou, Lei [1 ]
Zhou, Jun [2 ]
Affiliations
[1] Beihang Univ, Sch Comp Sci & Engn, Beijing Adv Innovat Ctr Big Data & Brain Comp, Jiangxi Res Inst, Beijing, Peoples R China
[2] Griffith Univ, Sch Informat & Commun Technol, Nathan, Qld, Australia
Source
2019 IEEE 31ST INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2019) | 2019
Funding
National Natural Science Foundation of China;
Keywords
filter pruning; network pruning; CNN acceleration;
DOI
10.1109/ICTAI.2019.00111
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Acceleration of convolutional neural networks has received increasing attention in recent years. Among various acceleration techniques, filter pruning has an inherent merit: it directly reduces the number of convolution filters. However, most filter pruning methods resort to a tedious and time-consuming layer-by-layer pruning-recovery strategy to avoid a significant drop in accuracy. In this paper, we present an efficient filter pruning framework to solve this problem. Our method accelerates the network in a one-step pruning-recovery manner with a novel optimization objective function, achieving higher accuracy at much lower cost than existing pruning methods. Furthermore, our method allows network compression with global filter pruning: given a global pruning rate, it adaptively determines the pruning rate for each convolutional layer, whereas these rates are often set as hyper-parameters in previous approaches. Evaluated on VGG-16 and ResNet-50 using ImageNet, our approach outperforms several state-of-the-art methods with less accuracy drop under the same or even much fewer floating-point operations (FLOPs).
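The global filter pruning described in the abstract can be illustrated with a minimal sketch. Note this is an assumption for illustration, not the authors' actual objective function: here every filter in the network is scored by a simple L1-norm saliency and the globally lowest-scoring fraction is pruned, so the per-layer pruning rates fall out of the global ranking rather than being set as hyper-parameters. The function name and the L1 criterion are both hypothetical.

```python
import numpy as np

def global_filter_pruning(layer_weights, global_rate):
    """Illustrative sketch: rank all filters across all layers by
    L1 norm and prune the globally lowest-scoring fraction.
    layer_weights: list of arrays shaped (out_ch, in_ch, kh, kw).
    Returns kept filter indices per layer and the per-layer pruning
    rates implied by the global ranking."""
    scores = []  # (layer_idx, filter_idx, l1_norm) for every filter
    for li, w in enumerate(layer_weights):
        norms = np.abs(w).sum(axis=(1, 2, 3))  # one L1 norm per output filter
        for fi, n in enumerate(norms):
            scores.append((li, fi, float(n)))

    # Prune the globally weakest fraction of filters.
    n_prune = int(len(scores) * global_rate)
    pruned = sorted(scores, key=lambda s: s[2])[:n_prune]

    keep = {li: set(range(w.shape[0])) for li, w in enumerate(layer_weights)}
    for li, fi, _ in pruned:
        keep[li].discard(fi)

    # Per-layer rates emerge from the global ranking.
    per_layer_rate = {
        li: 1.0 - len(keep[li]) / layer_weights[li].shape[0] for li in keep
    }
    return keep, per_layer_rate
```

With one layer of uniformly weak filters and one layer of strong filters, a 50% global rate removes the weak layer's filters entirely and leaves the strong layer untouched, showing how per-layer rates can differ sharply under a single global budget.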
Pages: 768-775
Number of pages: 8
Related Papers
45 records in total
  • [31] Batch-Normalization-based Soft Filter Pruning for Deep Convolutional Neural Networks
    Xu, Xiaozhou
    Chen, Qiming
    Xie, Lei
    Su, Hongye
    16TH IEEE INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION (ICARCV 2020), 2020, : 951 - 956
  • [32] A novel and efficient model pruning method for deep convolutional neural networks by evaluating the direct and indirect effects of filters
    Zheng, Yongbin
    Sun, Peng
    Ren, Qian
    Xu, Wanying
    Zhu, Di
    NEUROCOMPUTING, 2024, 569
  • [34] Filter pruning by image channel reduction in pre-trained convolutional neural networks
    Chung, Gi Su
    Won, Chee Sun
    MULTIMEDIA TOOLS AND APPLICATIONS, 2021, 80 (20) : 30817 - 30826
  • [36] Joint filter and channel pruning of convolutional neural networks as a bi-level optimization problem
    Louati, Hassen
    Louati, Ali
    Bechikh, Slim
    Kariri, Elham
    MEMETIC COMPUTING, 2024, 16 (01) : 71 - 90
  • [37] Multi-objective evolutionary architectural pruning of deep convolutional neural networks with weights inheritance
    Chung, K. T.
    Lee, C. K. M.
    Tsang, Y. P.
    Wu, C. H.
    Asadipour, Ali
    INFORMATION SCIENCES, 2024, 685
  • [38] PSE-Net: Channel pruning for Convolutional Neural Networks with parallel-subnets estimator
    Wang, Shiguang
    Xie, Tao
    Liu, Haijun
    Zhang, Xingcheng
    Cheng, Jian
    NEURAL NETWORKS, 2024, 174
  • [39] Adding Before Pruning: Sparse Filter Fusion for Deep Convolutional Neural Networks via Auxiliary Attention
    Tian, Guanzhong
    Sun, Yiran
    Liu, Yuang
    Zeng, Xianfang
    Wang, Mengmeng
    Liu, Yong
    Zhang, Jiangning
    Chen, Jun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021
  • [40] Learning sparse convolutional neural networks through filter pruning for efficient fault diagnosis on edge devices
    Xu, Gaowei
    Zhao, Yukai
    Liu, Min
    NONDESTRUCTIVE TESTING AND EVALUATION, 2025