An optimal-score-based filter pruning for deep convolutional neural networks

Cited by: 0
Authors
Shrutika S. Sawant
J. Bauer
F. X. Erick
Subodh Ingaleshwar
N. Holzer
A. Ramming
E. W. Lang
Th. Götz
Affiliations
[1] Fraunhofer IIS, Department of Internal Medicine 3, Rheumatology & Immunology
[2] Fraunhofer Institute for Integrated Circuits IIS, CIML Group, Biophysics
[3] JSS Academy of Technical Education
[4] Friedrich-Alexander-University (FAU) Erlangen-Nürnberg and University Hospital Erlangen
[5] University of Regensburg
Source
Applied Intelligence | 2022 / Volume 52
Keywords
CNN; Deep learning; Filter pruning; Image segmentation; Model compression; Redundancy;
DOI
Not available
Abstract
Convolutional Neural Networks (CNN) have achieved excellent performance in the processing of high-resolution images. Most of these networks contain many deep layers in pursuit of greater segmentation performance. However, over-sized CNN models result in excessive memory usage and high inference costs. Earlier studies have revealed that over-sized deep neural models tend to contain an abundance of redundant filters that are highly similar to one another and contribute little or nothing to the model while adding to its inference cost. Therefore, we propose a novel optimal-score-based filter pruning (OSFP) approach to prune redundant filters according to their relative similarity in feature space. OSFP not only speeds up learning in the network but also eradicates redundant filters, leading to an improvement in segmentation performance. We empirically demonstrate on widely used segmentation network models (TernausNet, classical U-Net and VGG16 U-Net) and benchmark datasets (Inria Aerial Image Labeling Dataset and Aerial Imagery for Roof Segmentation (AIRS)) that computation costs (in terms of Floating Point Operations (FLOPs) and parameters) are reduced significantly, while accuracy is maintained or even improved.
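The abstract does not detail the scoring criterion, so the sketch below is only a rough illustration of similarity-based filter pruning, not the authors' OSFP method: it scores each filter of a PyTorch convolutional layer by its maximum cosine similarity to the other filters in the same layer and marks the most redundant ones for removal. The function names, the cosine-similarity criterion, and the pruning ratio are assumptions made for illustration.

# Hypothetical sketch of similarity-based filter scoring; not the paper's OSFP algorithm.
import torch
import torch.nn as nn
import torch.nn.functional as F

def filter_redundancy_scores(conv: nn.Conv2d) -> torch.Tensor:
    """Score each filter by its maximum cosine similarity to any other filter.
    A higher score means a near-duplicate exists, i.e. the filter is more redundant."""
    w = conv.weight.detach().flatten(1)   # (out_channels, in_channels * kh * kw)
    w = F.normalize(w, dim=1)             # unit-norm rows for cosine similarity
    sim = w @ w.t()                       # pairwise cosine similarity matrix
    sim.fill_diagonal_(-1.0)              # ignore self-similarity
    return sim.max(dim=1).values          # redundancy score per filter

def select_filters_to_prune(conv: nn.Conv2d, prune_ratio: float = 0.25) -> torch.Tensor:
    """Return indices of the most redundant filters, given an assumed pruning ratio."""
    scores = filter_redundancy_scores(conv)
    n_prune = int(prune_ratio * conv.out_channels)
    return torch.topk(scores, n_prune).indices

if __name__ == "__main__":
    layer = nn.Conv2d(64, 128, kernel_size=3)
    to_prune = select_filters_to_prune(layer, prune_ratio=0.25)
    print(f"Pruning {to_prune.numel()} of {layer.out_channels} filters")

In practice, the selected output channels (and the corresponding input channels of the following layer) would be removed and the slimmed network fine-tuned; the paper's optimal-score criterion may differ from this simple cosine-similarity stand-in.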
Pages: 17557-17579
Page count: 22