Class-Separation Preserving Pruning for Deep Neural Networks

Cited: 0
Authors
Preet I. [1 ,2 ]
Boydell O. [1 ]
John D. [3 ]
Affiliations
[1] University College Dublin, CeADAR - Ireland's Centre for Applied AI, Dublin
[2] Eaton Corporation Plc., Dublin
[3] University College Dublin, School of Electrical and Electronics Engineering, Dublin
Source
IEEE Transactions on Artificial Intelligence | 2024, Vol. 5, Issue 01
Keywords
Class-separation score (CSS); deep neural networks (DNNs); pruning; structured pruning
DOI
10.1109/TAI.2022.3228511
Abstract
Neural network pruning has been deemed essential for deploying deep neural networks on resource-constrained edge devices, greatly reducing the number of network parameters without drastically compromising accuracy. One class of techniques proposed in the literature assigns an importance score to each parameter and prunes those with the least importance. However, most of these methods rely on generalized estimates of each parameter's importance and ignore the context of the specific task at hand. In this article, we propose a task-specific pruning approach, CSPrune, based on how efficiently a neuron or a convolutional filter separates classes. Our axiomatic approach assigns an importance score based on how separable the different classes are in the output activations or feature maps, preserving class separation and thereby avoiding a reduction in classification accuracy. Additionally, most pruning algorithms prune individual connections or weights, producing an unstructured sparse network without considering whether the hardware the network is deployed on can actually exploit that sparsity. CSPrune instead prunes whole neurons or filters, which yields a more structured pruned network whose sparsity can be utilized more efficiently by the hardware. We evaluate our pruning method on a variety of benchmark datasets, both small and large, and on several network architectures, and show that our approach outperforms comparable pruning techniques. © 2020 IEEE.
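The abstract does not give the exact definition of the class-separation score (CSS); as an illustration only, a Fisher-ratio-style proxy for "how separable classes are in a neuron's output activations" could be sketched as follows. All function names and the specific score formula here are hypothetical, not taken from the paper:

```python
import numpy as np

def class_separation_scores(activations, labels):
    """Hypothetical per-neuron separation score (Fisher-ratio style):
    variance of the per-class mean activations (between-class spread)
    divided by the average within-class variance. Higher = the neuron
    separates classes better and should be kept."""
    classes = np.unique(labels)
    class_means = np.stack(
        [activations[labels == c].mean(axis=0) for c in classes]
    )
    between = class_means.var(axis=0)          # spread of class means, per neuron
    within = np.mean(
        [activations[labels == c].var(axis=0) for c in classes], axis=0
    )                                          # avg intra-class spread, per neuron
    return between / (within + 1e-8)           # epsilon avoids division by zero

def neurons_to_prune(activations, labels, prune_ratio=0.5):
    """Indices of the least class-separating neurons (whole-unit,
    structured pruning rather than individual weights)."""
    scores = class_separation_scores(activations, labels)
    k = int(len(scores) * prune_ratio)
    return np.argsort(scores)[:k]              # lowest-scoring neurons first
```

Ranking whole neurons (or, analogously, per-filter feature maps averaged over spatial dimensions) by such a score and removing the lowest-ranked units is what makes the resulting sparsity structured and hardware-friendly, in contrast to unstructured weight-level pruning.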
Pages: 290-299
Number of pages: 9
Related Papers
50 records in total
  • [21] A Framework for Pruning Deep Neural Networks Using Energy-Based Models
    Salehinejad, Hojjat
    Valaee, Shahrokh
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 3920 - 3924
  • [22] Noise-Tolerant Hardware-Aware Pruning for Deep Neural Networks
    Lu, Shun
    Chen, Cheng
    Zhang, Kunlong
    Zheng, Yang
    Hu, Zheng
    Hong, Wenjing
    Li, Guiying
    Yao, Xin
    ADVANCES IN SWARM INTELLIGENCE, ICSI 2023, PT II, 2023, 13969 : 127 - 138
  • [23] Smart Pruning of Deep Neural Networks Using Curve Fitting and Evolution of Weights
    Islam, Ashhadul
    Belhaouari, Samir Brahim
    MACHINE LEARNING, OPTIMIZATION, AND DATA SCIENCE, LOD 2022, PT II, 2023, 13811 : 62 - 76
  • [24] EvoPruneDeepTL: An evolutionary pruning model for transfer learning based deep neural networks
    Poyatos, Javier
    Molina, Daniel
    Martinez, Aritz D.
    Del Ser, Javier
    Herrera, Francisco
    NEURAL NETWORKS, 2023, 158 : 59 - 82
  • [25] Fine-Pruning: Defending Against Backdooring Attacks on Deep Neural Networks
    Liu, Kang
    Dolan-Gavitt, Brendan
    Garg, Siddharth
    RESEARCH IN ATTACKS, INTRUSIONS, AND DEFENSES, RAID 2018, 2018, 11050 : 273 - 294
  • [26] Structured Pruning for Deep Neural Networks with Adaptive Pruning Rate Derivation Based on Connection Sensitivity and Loss Function
    Sakai, Yasufumi
    Eto, Yu
    Teranishi, Yuta
    JOURNAL OF ADVANCES IN INFORMATION TECHNOLOGY, 2022, 13 (03) : 295 - 300
  • [27] Pruning by explaining: A novel criterion for deep neural network pruning
    Yeom, Seul-Ki
    Seegerer, Philipp
    Lapuschkin, Sebastian
    Binder, Alexander
    Wiedemann, Simon
    Mueller, Klaus-Robert
    Samek, Wojciech
    PATTERN RECOGNITION, 2021, 115
  • [28] Growing and Pruning Neural Tree Networks
    Sankar, A.
    Mammone, R. J.
    IEEE TRANSACTIONS ON COMPUTERS, 1993, 42 (03) : 291 - 299
  • [29] Multiobjective evolutionary pruning of Deep Neural Networks with Transfer Learning for improving their performance and robustness
    Poyatos, Javier
    Molina, Daniel
    Martinez-Seras, Aitor
    Del Ser, Javier
    Herrera, Francisco
    APPLIED SOFT COMPUTING, 2023, 147
  • [30] Multi-objective pruning of dense neural networks using deep reinforcement learning
    Hirsch, Lior
    Katz, Gilad
    INFORMATION SCIENCES, 2022, 610 : 381 - 400