Structured feature sparsity training for convolutional neural network compression

Cited by: 8
Authors
Wang, Wei [1 ,2 ]
Zhu, Liqiang [1 ,2 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Mech Elect & Control Engn, Beijing 100044, Peoples R China
[2] Beijing Jiaotong Univ, Key Lab Vehicle Adv Mfg Measuring & Control Techn, Minist Educ, Beijing 100044, Peoples R China
Keywords
Convolutional neural network; CNN compression; Structured sparsity; Pruning criterion
DOI
10.1016/j.jvcir.2020.102867
Chinese Library Classification (CLC)
TP [automation technology; computer technology]
Subject classification code
0812
Abstract
Convolutional neural networks (CNNs) with large model sizes and heavy computational costs are difficult to deploy on embedded systems such as smartphones or AI cameras. In this paper, we propose a novel structured pruning method, termed structured feature sparsity training (SFST), to speed up inference and reduce the memory usage of CNNs. Unlike existing pruning methods, which require multiple iterations of pruning and retraining to maintain stable performance, SFST only fine-tunes the pretrained model with additional regularization on the less important features and then prunes them; no repeated pruning and retraining is needed. SFST can be applied to a variety of modern CNN architectures, including VGGNet, ResNet and MobileNetV2. Experimental results on the CIFAR, SVHN, ImageNet and MSTAR benchmark datasets demonstrate the effectiveness of our scheme, which achieves superior performance over state-of-the-art methods.
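To make the training scheme concrete, below is a minimal PyTorch sketch of the idea the abstract describes: fine-tune a pretrained model with an extra sparsity penalty restricted to the less important feature channels, then prune those channels in a single pass. This is an illustrative reconstruction, not the authors' implementation; reading channel importance from BatchNorm scale factors (a network-slimming-style assumption) and the names sfst_penalty, prune_ratio and strength are assumptions made for the example.

import torch
import torch.nn as nn

def sfst_penalty(model, prune_ratio=0.5, strength=1e-4):
    # ASSUMPTION: channel importance is measured by the absolute BatchNorm
    # scale factor |gamma|, as in network slimming; the paper's actual
    # criterion may differ. Only the prune_ratio fraction of least
    # important channels receives the L1 penalty.
    penalty = 0.0
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            scores = m.weight.abs()                # per-channel importance
            k = int(prune_ratio * scores.numel())
            if k > 0:
                low = torch.argsort(scores)[:k]    # k least important channels
                penalty = penalty + scores[low].sum()
    return strength * penalty

# Fine-tuning loop with the penalty added to the task loss; random tensors
# stand in for real training data.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
images, labels = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
for _ in range(5):
    loss = nn.functional.cross_entropy(model(images), labels)
    loss = loss + sfst_penalty(model)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
# After fine-tuning, channels whose |gamma| was driven toward zero would be
# pruned once, with no further pruning/retraining iterations.

Restricting the penalty to the lowest-scoring channels, rather than regularizing all channels uniformly, is what would allow a single fine-tune-then-prune pass to suffice in a scheme of this kind.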
Pages: 8
Related papers (50 in total)
  • [1] DTS: dynamic training slimming with feature sparsity for efficient convolutional neural network
    Yin, Jia
    Wang, Wei
    Guo, Zhonghua
    Ji, Yangchun
    Journal of Real-Time Image Processing, 2024, 21 (04)
  • [2] Dynamic sparsity and model feature learning enhanced training for convolutional neural network pruning
    Ruan, X.
    Hu, W.
    Liu, Y.
    Li, B.
    Zhongguo Kexue Jishu Kexue/Scientia Sinica Technologica, 2022, 52 (05): 667 - 681
  • [3] Structured pruning via feature channels similarity and mutual learning for convolutional neural network compression
    Yang, Wei
    Xiao, Yancai
    Applied Intelligence, 2022, 52 (12): 14560 - 14570
  • [4] LassoNet: A Neural Network with Feature Sparsity
    Lemhadri, Ismael
    Ruan, Feng
    Abraham, Louis
    Tibshirani, Robert
    Journal of Machine Learning Research, 2021, 22
  • [5] Visualization of Feature Evolution During Convolutional Neural Network Training
    Punjabi, Arjun
    Katsaggelos, Aggelos K.
    2017 25th European Signal Processing Conference (EUSIPCO), 2017: 311 - 315
  • [6] Feature Mining: A Novel Training Strategy for Convolutional Neural Network
    Xie, Tianshu
    Deng, Jiali
    Cheng, Xuan
    Liu, Minghui
    Wang, Xiaomin
    Liu, Ming
    Applied Sciences-Basel, 2022, 12 (07)
  • [7] A Feature Map Lossless Compression Framework for Convolutional Neural Network Accelerators
    Zhang, Zekun
    Jiao, Xin
    Xu, Chengyu
    2024 IEEE 6th International Conference on AI Circuits and Systems (AICAS), 2024: 422 - 426
  • [8] Robust feature space separation for deep convolutional neural network training
    Sekmen, A.
    Parlaktuna, M.
    Abdul-Malek, A.
    Erdemir, E.
    Koku, A. B.
    Discover Artificial Intelligence, 2021, 1 (01)