Learning to Prune Deep Neural Networks via Layer-wise Optimal Brain Surgeon

Citations: 0
Authors
Dong, Xin [1 ]
Chen, Shangyu [1 ]
Pan, Sinno Jialin [1 ]
Affiliations
[1] Nanyang Technol Univ, Singapore, Singapore
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017) | 2017 / Vol. 30
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
How to develop slim and accurate deep neural networks has become crucial for real-world applications, especially for those employed in embedded systems. Though previous work along this research line has shown some promising results, most existing methods either fail to significantly compress a well-trained deep network or require a heavy retraining process for the pruned deep network to recover its prediction performance. In this paper, we propose a new layer-wise pruning method for deep neural networks. In our proposed method, parameters of each individual layer are pruned independently based on second-order derivatives of a layer-wise error function with respect to the corresponding parameters. We prove that the final prediction performance drop after pruning is bounded by a linear combination of the reconstruction errors caused at each layer. By controlling layer-wise errors properly, one only needs to perform a light retraining process on the pruned network to resume its original prediction performance. We conduct extensive experiments on benchmark datasets to demonstrate the effectiveness of our pruning method compared with several state-of-the-art baseline methods. Code for our work is released at: https://github.com/csyhhu/L-OBS.
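The pruning step the abstract describes can be illustrated with a minimal NumPy sketch of the classic Optimal Brain Surgeon criterion applied to one layer. The layer-wise error Hessian reduces to H = YᵀY/n, where Y holds the layer's inputs; each weight's saliency is w²/(2[H⁻¹]_qq), and the remaining weights receive a compensating update when a weight is removed. This is a simplified illustration (greedy one-weight-at-a-time loop, damped Hessian, pruned entries re-zeroed after each compensation), not the authors' released implementation; the function name and parameters are assumptions for this sketch.

```python
import numpy as np

def prune_layer_obs(W, Y, n_prune, damp=1e-4):
    """Simplified layer-wise OBS pruning sketch.

    W: (d_in, d_out) weight matrix of one layer.
    Y: (n, d_in) inputs to that layer, used to build the Hessian.
    Prunes n_prune scalar weights, compensating the survivors.
    """
    W = W.astype(float).copy()
    d_in = W.shape[0]
    # Layer-wise Hessian of the squared reconstruction error is shared
    # across output units: H = Y^T Y / n, damped for invertibility.
    H = Y.T @ Y / len(Y) + damp * np.eye(d_in)
    H_inv = np.linalg.inv(H)
    mask = np.ones_like(W, dtype=bool)  # True = weight still present
    for _ in range(n_prune):
        # Saliency of each surviving weight: w^2 / (2 [H^-1]_qq).
        diag = np.diag(H_inv)[:, None]
        sal = np.where(mask, W**2 / (2.0 * diag), np.inf)
        q, j = np.unravel_index(np.argmin(sal), sal.shape)
        # OBS compensating update for output column j, then remove w_qj.
        W[:, j] -= W[q, j] / H_inv[q, q] * H_inv[:, q]
        W[q, j] = 0.0
        mask[q, j] = False
        W[~mask] = 0.0  # keep previously pruned entries at zero
    return W, mask
```

In the full method the compensation and Hessian inverse are maintained under the constraint that pruned weights stay zero; the re-zeroing here is a crude stand-in that keeps the sketch short.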
Pages: 11