SFPBL: Soft Filter Pruning Based on Logistic Growth Differential Equation for Neural Network

Times Cited: 0
Authors
Hu, Can [1 ]
Zhang, Shanqing [2 ]
Tao, Kewei [2 ]
Yang, Gaoming [1 ]
Li, Li [2 ]
Affiliations
[1] Hangzhou Dianzi Univ, HDU ITMO Joint Inst, Hangzhou 310018, Peoples R China
[2] Hangzhou Dianzi Univ, Sch Comp Sci & Technol, Hangzhou 310018, Peoples R China
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2025, Vol. 82, No. 3
Funding
National Natural Science Foundation of China
Keywords
Filter pruning; channel pruning; CNN complexity; deep neural networks; filtering theory; logistic model;
DOI
10.32604/cmc.2025.059770
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The surge of large-scale models in recent years has led to breakthroughs in numerous fields, but it has also introduced higher computational costs and more complex network architectures. These increasingly large and intricate networks pose challenges for deployment and execution while also exacerbating the issue of network over-parameterization. To address this issue, various network compression techniques have been developed, such as network pruning. A typical pruning algorithm follows a three-step pipeline involving training, pruning, and retraining. Existing methods often directly set the pruned filters to zero during retraining, significantly reducing the parameter space. However, this direct pruning strategy frequently results in irreversible information loss. In the early stages of training, a network still contains much uncertainty, so evaluations of filter importance may not yet be reliable. To manage the pruning process effectively, this paper proposes a flexible neural network pruning algorithm based on the logistic growth differential equation, designed around the characteristics of network training. Unlike other pruning algorithms that directly reduce filter weights, this algorithm introduces a three-stage adaptive weight decay strategy inspired by the logistic growth differential equation. It employs a gentle decay rate in the initial training stage, a rapid decay rate during the intermediate stage, and a slower decay rate in the network convergence stage. Additionally, the decay rate is adjusted adaptively based on the filter weights at each stage. By controlling the adaptive decay rate at each stage, the pruning of neural network filters can be effectively managed. In experiments on the CIFAR-10 and ILSVRC2012 datasets, the proposed algorithm significantly reduces floating-point operations compared with existing methods at the same pruning rate.
Specifically, when implementing a 30% pruning rate on the ResNet-110 network, the pruned neural network not only decreases floating-point operations by 40.8% but also enhances the classification accuracy by 0.49% compared to the original network.
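The abstract's gentle-rapid-slow decay schedule matches the S-shaped solution of the logistic growth equation dP/dt = rP(1 − P/K). A minimal sketch of such a schedule applied as soft (recoverable) filter decay is shown below; the function names, the midpoint at half of training, and the steepness constant k = 10 are illustrative assumptions, not the paper's exact parameterization, and the adaptive per-stage adjustment based on filter weights is omitted for brevity:

```python
import math

def logistic_decay_factor(epoch, total_epochs, k=10.0):
    """Cumulative decay following a logistic S-curve: gentle early,
    rapid mid-training, slow near convergence.
    (Illustrative constants, not the paper's exact schedule.)"""
    t = epoch / total_epochs  # normalized training progress in [0, 1]
    # Logistic function centered at mid-training (t0 = 0.5).
    return 1.0 / (1.0 + math.exp(-k * (t - 0.5)))

def soft_prune_step(filters, prune_mask, epoch, total_epochs):
    """Scale masked filters toward zero instead of hard-zeroing them,
    so early-stage information loss stays recoverable."""
    keep = 1.0 - logistic_decay_factor(epoch, total_epochs)
    return [w * keep if masked else w
            for w, masked in zip(filters, prune_mask)]
```

At epoch 0 the keep factor is close to 1 (almost no decay), at mid-training masked filters are halved, and near the end they approach zero, reproducing the three-stage behavior the abstract describes.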
Pages: 4913-4930 (18 pages)