Towards efficient filter pruning via topology

Cited by: 0
Authors
Xiaozhou Xu
Jun Chen
Hongye Su
Lei Xie
Affiliations
[1] Zhejiang University, State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering
Source
Journal of Real-Time Image Processing | 2022, Vol. 19
Keywords
Model compression; Filter pruning; Neural networks; Image classification
DOI: not available
Abstract
With the development of deep neural networks, compressing and accelerating them without performance deterioration has become a research hotspot. Among network compression methods, network pruning is one of the most effective and popular. Inspired by several property-based pruning methods and by geometric topology, we focus our pruning method on extracting information from feature maps. We predefine a metric, called TopologyHole, that describes a feature map, and associate it with the importance of the corresponding filter. In exploratory experiments, we find that the average TopologyHole of the feature maps produced by a given filter is relatively stable regardless of the number of image batches the CNN receives, which indicates that TopologyHole is a data-independent metric and a valid criterion for filter pruning. Extensive experiments demonstrate that preferentially pruning filters with high-TopologyHole feature maps achieves performance competitive with the state of the art. Notably, on ImageNet, TopologyHole reduces FLOPs by 45.0% and parameters by 40.9% on ResNet-50 while retaining 75.71% top-1 accuracy, a loss of only 0.44%.
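The abstract does not give the formula for TopologyHole, so the sketch below only illustrates the pruning workflow it describes: score each filter by a statistic of its feature maps averaged over a batch, then mark the highest-scoring filters for removal first. The `topology_hole_scores` function is a hypothetical stand-in (the fraction of zero activations, i.e. "holes" in the activation pattern), not the paper's actual topological computation.

```python
import torch
import torch.nn as nn

def topology_hole_scores(feature_maps: torch.Tensor) -> torch.Tensor:
    # feature_maps: (batch, channels, H, W) output of one conv layer.
    # Placeholder proxy: fraction of non-activated ("hole") pixels per
    # filter, averaged over the batch and spatial dimensions. The true
    # TopologyHole metric is topological; its formula is not stated in
    # the abstract, so this proxy is illustrative only.
    return (feature_maps <= 0).float().mean(dim=(0, 2, 3))

# Example: score the 16 filters of a small conv layer on one random batch.
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
maps = torch.relu(conv(torch.randn(8, 3, 32, 32)))
scores = topology_hole_scores(maps)

# Filters whose feature maps score highest are pruned first.
prune_order = torch.argsort(scores, descending=True)
print("prune first:", prune_order[:4].tolist())
```

Per the abstract, these per-filter averages stay roughly constant across different image batches, which is what lets the score be computed once on a small batch rather than over the whole dataset.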
Pages: 639–649
Page count: 10