Class-dependent Pruning of Deep Neural Networks

Cited: 3
Authors
Entezari, Rahim [1 ]
Saukh, Olga [1 ]
Institutions
[1] Graz Univ Technol, Inst Tech Informat, CSH Vienna, Graz, Austria
Source
2020 IEEE SECOND WORKSHOP ON MACHINE LEARNING ON EDGE IN SENSOR SYSTEMS (SENSYS-ML 2020) | 2020
Keywords
deep neural network compression; pruning; lottery ticket hypothesis; data imbalance; class imbalance;
DOI
10.1109/SenSysML50931.2020.00010
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Today's deep neural networks require substantial computational resources for training, storage and inference, which limits their effective use on resource-constrained devices. Many recent research activities explore different options for compressing and optimizing deep models. On the one hand, many real-world applications face the data imbalance challenge, i.e., the number of labeled instances of one class considerably outweighs the number of labeled instances of the other class. On the other hand, applications may pose a class imbalance problem, i.e., a higher number of false positives may be tolerable when training a model and optimizing its performance, yet the number of false negatives must stay low. The problem originates from the fact that some classes are more important for the application than others, e.g., detection problems in medical and surveillance domains. Motivated by the success of the lottery ticket hypothesis, in this paper we propose an iterative deep model compression technique which keeps the number of false negatives of the compressed model close to that of the original model, at the price of increasing the number of false positives if necessary. Our experimental evaluation on two benchmark data sets shows that the resulting compressed sub-networks 1) achieve up to 35% fewer false negatives than the compressed model without class optimization, 2) provide an overall higher AUC-ROC measure compared to the conventional Lottery Ticket algorithm and three recent popular pruning methods, and 3) use up to 99% fewer parameters compared to the original network. The code is publicly available(1).
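To make the approach concrete, below is a minimal sketch of what such class-dependent, lottery-ticket-style pruning could look like in PyTorch. This is an illustration under assumptions, not the authors' published implementation: the names iterative_class_aware_pruning, train_fn, prune_frac, fn_weight and critical_class are hypothetical. The sketch combines iterative magnitude pruning with weight rewinding and a class-weighted loss that makes missing the critical class (a false negative) more expensive than a false positive.

    import copy
    import torch
    import torch.nn as nn

    def iterative_class_aware_pruning(model, train_fn, rounds=5, prune_frac=0.2,
                                      fn_weight=5.0, critical_class=1, n_classes=2):
        # Hypothetical helper: each round trains the network, prunes the
        # smallest-magnitude surviving weights per layer, and rewinds the
        # survivors to their initial values (lottery ticket style).
        init_state = copy.deepcopy(model.state_dict())   # weights at initialization
        masks = {name: torch.ones_like(p)                # 1 = keep, 0 = pruned
                 for name, p in model.named_parameters() if p.dim() > 1}

        # Up-weighting the critical class makes false negatives on it more
        # costly, trading them for extra false positives if necessary.
        class_weights = torch.ones(n_classes)
        class_weights[critical_class] = fn_weight
        criterion = nn.CrossEntropyLoss(weight=class_weights)

        for _ in range(rounds):
            # train_fn is a user-supplied training loop; it must re-apply
            # `masks` after every optimizer step so pruned weights stay zero.
            train_fn(model, criterion, masks)
            with torch.no_grad():
                for name, param in model.named_parameters():
                    if name not in masks:
                        continue
                    # Threshold on the magnitudes of still-surviving weights.
                    alive = param[masks[name].bool()].abs()
                    k = max(1, int(prune_frac * alive.numel()))
                    masks[name][param.abs() < alive.kthvalue(k).values] = 0.0
                # Rewind surviving weights to initialization: the "winning ticket".
                model.load_state_dict(init_state)
                for name, param in model.named_parameters():
                    if name in masks:
                        param.mul_(masks[name])
        return model, masks

The paper's class-dependent criterion would additionally monitor the false negatives of the pruned model on a validation set and keep them close to those of the original network; the class-weighted loss above is only one simple proxy for that constraint.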
Pages: 13-18
Number of pages: 6
Related papers
50 items in total
  • [1] Class-Separation Preserving Pruning for Deep Neural Networks
    Preet I.
    Boydell O.
    John D.
    IEEE Transactions on Artificial Intelligence, 2024, 5 (01): 290-299
  • [2] QLP: Deep Q-Learning for Pruning Deep Neural Networks
    Camci, Efe
    Gupta, Manas
    Wu, Min
    Lin, Jie
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2022, 32 (10): 6488-6501
  • [3] Anonymous Model Pruning for Compressing Deep Neural Networks
    Zhang, Lechun
    Chen, Guangyao
    Shi, Yemin
    Zhang, Quan
    Tan, Mingkui
    Wang, Yaowei
    Tian, Yonghong
    Huang, Tiejun
    THIRD INTERNATIONAL CONFERENCE ON MULTIMEDIA INFORMATION PROCESSING AND RETRIEVAL (MIPR 2020), 2020: 161-164
  • [4] Trained Rank Pruning for Efficient Deep Neural Networks
    Xu, Yuhui
    Li, Yuxi
    Zhang, Shuai
    Wen, Wei
    Wang, Botao
    Dai, Wenrui
    Qi, Yingyong
    Chen, Yiran
    Lin, Weiyao
    Xiong, Hongkai
    FIFTH WORKSHOP ON ENERGY EFFICIENT MACHINE LEARNING AND COGNITIVE COMPUTING - NEURIPS EDITION (EMC2-NIPS 2019), 2019: 14-17
  • [5] CUP: Cluster Pruning for Compressing Deep Neural Networks
    Duggal, Rahul
    Xiao, Cao
    Vuduc, Richard
    Duen Horng Chau
    Sun, Jimeng
    2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2021: 5102-5106
  • [6] Deep Neural Networks Pruning via the Structured Perspective Regularization
    Cacciola, Matteo
    Frangioni, Antonio
    Li, Xinlin
    Lodi, Andrea
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2023, 5 (04): 1051-1077
  • [7] Acceleration of Deep Convolutional Neural Networks Using Adaptive Filter Pruning
    Singh, Pravendra
    Verma, Vinay Kumar
    Rai, Piyush
    Namboodiri, Vinay P.
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2020, 14 (04): 838-847
  • [8] Neuroplasticity-Based Pruning Method for Deep Convolutional Neural Networks
    Camacho, Jose David
    Villasenor, Carlos
    Lopez-Franco, Carlos
    Arana-Daniel, Nancy
    APPLIED SCIENCES-BASEL, 2022, 12 (10)
  • [9] Preemptively pruning Clever-Hans strategies in deep neural networks
    Linhardt, Lorenz
    Mueller, Klaus-Robert
    Montavon, Gregoire
    INFORMATION FUSION, 2024, 103
  • [10] Compression of Deep Convolutional Neural Networks Using Effective Channel Pruning
    Guo, Qingbei
    Wu, Xiao-Jun
    Zhao, Xiuyang
    IMAGE AND GRAPHICS, ICIG 2019, PT I, 2019, 11901: 760-772