Class-dependent Pruning of Deep Neural Networks

Cited by: 3
Authors
Entezari, Rahim [1 ]
Saukh, Olga [1 ]
Affiliations
[1] Graz Univ Technol, Inst Tech Informat, CSH Vienna, Graz, Austria
Source
2020 IEEE SECOND WORKSHOP ON MACHINE LEARNING ON EDGE IN SENSOR SYSTEMS (SENSYS-ML 2020) | 2020
Keywords
deep neural network compression; pruning; lottery ticket hypothesis; data imbalance; class imbalance;
DOI
10.1109/SenSysML50931.2020.00010
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Today's deep neural networks require substantial computational resources for training, storage and inference, which limits their effective use on resource-constrained devices. Many recent research efforts explore options for compressing and optimizing deep models. On the one hand, many real-world applications face a data imbalance challenge, i.e., the number of labeled instances of one class considerably outweighs that of another. On the other hand, applications may pose a class imbalance problem, i.e., a higher number of false positives may be tolerable when training a model and optimizing its performance, yet the number of false negatives must stay low. This problem originates from the fact that some classes are more important for the application than others, e.g., detection problems in the medical and surveillance domains. Motivated by the success of the lottery ticket hypothesis, in this paper we propose an iterative deep model compression technique that keeps the number of false negatives of the compressed model close to that of the original model, at the price of increasing the number of false positives if necessary. Our experimental evaluation on two benchmark data sets shows that the resulting compressed sub-networks 1) achieve up to 35% fewer false negatives than the compressed model without class optimization, 2) provide an overall higher AUC-ROC measure compared to the conventional Lottery Ticket algorithm and three recent popular pruning methods, and 3) use up to 99% fewer parameters compared to the original network. The code is publicly available(1).
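The idea sketched in the abstract, i.e., iterative lottery-ticket-style magnitude pruning that accepts a pruning round only while the false-negative count stays close to that of the unpruned model, can be illustrated as follows. This is a hypothetical simplification, not the paper's implementation: the linear classifier, the function names, and the single FN-budget stopping rule (`fn_slack`) are all illustrative assumptions.

```python
import numpy as np

def false_negatives(w, X, y):
    """Count false negatives of a linear classifier sign(X @ w) on labels in {0, 1}."""
    preds = (X @ w > 0).astype(int)
    return int(np.sum((y == 1) & (preds == 0)))

def class_aware_iterative_prune(w_init, X, y, prune_frac=0.2, rounds=5, fn_slack=0):
    """Iteratively prune the smallest-magnitude weights, rewinding survivors to
    their initial values (lottery-ticket style), and stop as soon as the
    false-negative count would exceed the unpruned model's by more than fn_slack."""
    fn_ref = false_negatives(w_init, X, y)       # FN budget of the original model
    mask = np.ones_like(w_init, dtype=bool)
    w = w_init.copy()
    for _ in range(rounds):
        alive = np.flatnonzero(mask)
        if len(alive) <= 1:
            break
        k = max(1, int(len(alive) * prune_frac))
        # indices of the k smallest-magnitude surviving weights
        order = alive[np.argsort(np.abs(w[alive]))]
        trial = mask.copy()
        trial[order[:k]] = False
        w_trial = np.where(trial, w_init, 0.0)   # rewind survivors to init values
        if false_negatives(w_trial, X, y) - fn_ref > fn_slack:
            break                                # FN budget for the important class exhausted
        mask, w = trial, w_trial
    return w, mask
```

On a toy problem where only a few weights matter, the loop prunes the irrelevant ones while the false-negative count never drifts beyond the allowed slack; the paper's actual criterion operates on deep networks and trades false positives for false negatives more carefully.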
Pages: 13-18 (6 pages)