Structured feature sparsity training for convolutional neural network compression

Cited by: 8
Authors
Wang, Wei [1 ,2 ]
Zhu, Liqiang [1 ,2 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Mech Elect & Control Engn, Beijing 100044, Peoples R China
[2] Beijing Jiaotong Univ, Key Lab Vehicle Adv Mfg Measuring & Control Techn, Minist Educ, Beijing 100044, Peoples R China
Keywords
Convolutional neural network; CNN compression; Structured sparsity; Pruning criterion;
DOI
10.1016/j.jvcir.2020.102867
CLC Classification Number
TP [Automation technology; computer technology];
Discipline Classification Code
0812;
Abstract
Convolutional neural networks (CNNs) with large model sizes and heavy computational workloads are difficult to deploy on embedded systems such as smartphones or AI cameras. In this paper, we propose a novel structured pruning method, termed structured feature sparsity training (SFST), to speed up inference and reduce the memory usage of CNNs. Unlike existing pruning methods that require multiple iterations of pruning and retraining to ensure stable performance, SFST only needs to fine-tune the pretrained model with additional regularization on the less important features and then prune them, with no repeated pruning and retraining required. SFST can be applied to a variety of modern CNN architectures, including VGGNet, ResNet and MobileNetv2. Experimental results on the CIFAR, SVHN, ImageNet and MSTAR benchmark datasets demonstrate the effectiveness of our scheme, which achieves superior performance over state-of-the-art methods.
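The abstract outlines the SFST recipe (regularize the less important features during fine-tuning, then prune once) but gives no implementation details. The sketch below is a minimal, hypothetical PyTorch reading of that recipe, not the authors' code: it assumes BatchNorm scale factors stand in for channel importance, and the names select_unimportant, sfst_penalty, prune_ratio and lam are illustrative choices.

# Hypothetical sketch of an SFST-style fine-tune-then-prune step (PyTorch).
# Assumption: BatchNorm scale factors (gamma) serve as the channel-importance
# proxy, and only the least important fraction of channels gets the extra penalty.
import torch
import torch.nn as nn

def select_unimportant(model, prune_ratio=0.3):
    # Flag the prune_ratio least important BN channels (smallest |gamma|).
    gammas = torch.cat([m.weight.data.abs().flatten()
                        for m in model.modules() if isinstance(m, nn.BatchNorm2d)])
    threshold = torch.quantile(gammas, prune_ratio)
    return {name: (m.weight.data.abs() <= threshold)
            for name, m in model.named_modules() if isinstance(m, nn.BatchNorm2d)}

def sfst_penalty(model, masks, lam=1e-4):
    # L1 regularization applied only to the channels flagged as less important.
    penalty = 0.0
    for name, m in model.named_modules():
        if isinstance(m, nn.BatchNorm2d):
            penalty = penalty + m.weight[masks[name]].abs().sum()
    return lam * penalty

# Fine-tuning step (model, loader, criterion, optimizer assumed to exist):
#   masks = select_unimportant(model, prune_ratio=0.3)
#   for images, labels in loader:
#       loss = criterion(model(images), labels) + sfst_penalty(model, masks)
#       optimizer.zero_grad(); loss.backward(); optimizer.step()
# After fine-tuning, channels whose gamma has been driven toward zero are
# pruned in a single pass; no further prune/retrain cycles are performed.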
Pages: 8
Related Papers
50 records in total
  • [21] Bui, Kevin; Park, Fredrick; Zhang, Shuai; Qi, Yingyong; Xin, Jack. Structured Sparsity of Convolutional Neural Networks via Nonconvex Sparse Group Regularization. FRONTIERS IN APPLIED MATHEMATICS AND STATISTICS, 2021, 6.
  • [22] Shi, Yilei; Li, Qingyu; Zhu, Xiao Xiang. Building segmentation through a gated graph convolutional neural network with deep structured feature embedding. ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING, 2020, 159: 184-197.
  • [23] Cao, Shijie; Ma, Lingxiao; Xiao, Wencong; Zhang, Chen; Liu, Yunxin; Zhang, Lintao; Nie, Lanshun; Yang, Zhi. SeerNet: Predicting Convolutional Neural Network Feature-Map Sparsity through Low-Bit Quantization. 2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019: 11208-11217.
  • [24] Dai, Pengcheng; Yang, Jianlei; Ye, Xucheng; Cheng, Xingzhou; Luo, Junyu; Song, Linghao; Chen, Yiran; Zhao, Weisheng. SparseTrain: Exploiting Dataflow Sparsity for Efficient Convolutional Neural Networks Training. PROCEEDINGS OF THE 2020 57TH ACM/EDAC/IEEE DESIGN AUTOMATION CONFERENCE (DAC), 2020.
  • [25] Huo, Lu; Rao, Tianrong; Zhang, Leijie. Fused feature encoding in convolutional neural network. MULTIMEDIA TOOLS AND APPLICATIONS, 2019, 78: 1635-1648.
  • [26] Huo, Lu; Rao, Tianrong; Zhang, Leijie. Fused feature encoding in convolutional neural network. MULTIMEDIA TOOLS AND APPLICATIONS, 2019, 78 (02): 1635-1648.
  • [27] Bragagnolo, Andrea; Tartaglione, Enzo; Fiandrotti, Attilio; Grangetto, Marco. On the Role of Structured Pruning for Neural Network Compression. 2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2021: 3527-3531.
  • [28] Yan, Bai-Kui; Ruan, Shanq-Jang. Area Efficient Compression for Floating-Point Feature Maps in Convolutional Neural Network Accelerators. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2023, 70 (02): 746-750.
  • [29] Huang, Wei; Wang, Qi; Li, Xuelong. Feature Sparsity in Convolutional Neural Networks for Scene Classification of Remote Sensing Image. 2019 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2019), 2019: 3017-3020.
  • [30] Leow, Cong Sheng; Goh, Wang Ling; Gao, Yuan. Sparsity Through Spiking Convolutional Neural Network for Audio Classification at the Edge. 2023 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2023.