Neuroplasticity-Based Pruning Method for Deep Convolutional Neural Networks

Cited by: 2
Authors
Camacho, Jose David [1 ]
Villasenor, Carlos [1 ]
Lopez-Franco, Carlos [1 ]
Arana-Daniel, Nancy [1 ]
Affiliations
[1] Univ Guadalajara, Dept Comp Sci, 1421 Marcelino Garcia Barragan, Guadalajara 44430, Jalisco, Mexico
Source
APPLIED SCIENCES-BASEL | 2022, Vol. 12, Issue 10
Keywords
pruning; neuroplasticity; deep learning; convolutional layers; transfer learning; PLASTICITY; CASCADE; CNN;
DOI
10.3390/app12104945
Chinese Library Classification
O6 [Chemistry]
Subject Classification Code
0703
Abstract
In this paper, a new pruning strategy based on the neuroplasticity of biological neural networks is presented. The proposed pruning algorithm is inspired by the ability of the cerebral cortex to remap knowledge after injuries. Thus, we propose simulating induced injuries in the network by pruning full convolutional layers or entire blocks, assuming that the knowledge from the removed segments of the network may be remapped and compressed during the recovery (retraining) process. To reconnect the remaining segments of the network, a translator block is introduced. The translator is composed of a pooling layer and a convolutional layer. The pooling layer is optional and is placed to ensure that the spatial dimensions of the feature maps match across the pruned segments. After that, a convolutional layer (simulating the intact cortex) is placed to ensure that the depths of the feature maps match; this layer is used to remap the removed knowledge. As a result, lightweight, efficient, and accurate sub-networks are created from the base models. A comparative analysis shows that, in contrast to other pruning methods, our approach does not require defining a threshold or metric as the pruning criterion. Instead, only the origin and destination of the pruning and reconnection points must be determined for the translator connection.
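The translator block described in the abstract (an optional pooling layer to match spatial size, followed by a convolutional layer to match depth) can be sketched as a small PyTorch module. This is a minimal illustration only: the kernel size, pooling type, activation, and parameter names below are assumptions, as the abstract does not specify them.

```python
import torch
import torch.nn as nn

class TranslatorBlock(nn.Module):
    """Reconnects the remaining segments of a network after a full layer
    or block has been pruned. The pooling layer is optional and only
    matches spatial dimensions; the convolution matches feature-map depth
    and is intended to remap the removed knowledge during retraining."""

    def __init__(self, in_channels, out_channels, pool_factor=1):
        super().__init__()
        # Pooling is only needed when the pruned segment reduced spatial size.
        self.pool = nn.MaxPool2d(pool_factor) if pool_factor > 1 else nn.Identity()
        # Kernel size 3 and ReLU are illustrative choices, not from the paper.
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1)
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(self.conv(self.pool(x)))

# Bridge a gap where the pruned block halved the spatial size
# and expanded the depth from 64 to 256 channels.
translator = TranslatorBlock(64, 256, pool_factor=2)
y = translator(torch.randn(1, 64, 32, 32))
print(y.shape)  # torch.Size([1, 256, 16, 16])
```

Only the origin (64 channels, 32x32) and destination (256 channels, 16x16) of the reconnection must be known to build the block; no pruning threshold or saliency metric is involved.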
Pages: 27