Pruning neural networks for inductive conformal prediction

Cited by: 0
Authors
Zhao, Xindi [1 ]
Bellotti, Anthony [1 ]
Affiliations
[1] Univ Nottingham, Sch Comp Sci, Ningbo, Peoples R China
Source
CONFORMAL AND PROBABILISTIC PREDICTION WITH APPLICATIONS, 2022, Vol. 179
Keywords
Inductive conformal prediction; Neural networks; Pruning; Overparameterization;
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Neural network pruning techniques remove redundant parameters from overparameterized neural networks in order to compress the model and reduce computational cost. The goal is to prune a neural network so that it retains the same, or nearly the same, predictive performance as the original. In this paper we study neural network pruning in the context of conformal prediction. To explore whether a neural network can be pruned while maintaining the predictive efficiency of the conformal predictor, we measure and compare the efficiency of the prediction sets produced by inductive conformal predictors built on pruned neural networks. We implement several existing pruning methods and propose a new pruning method designed specifically for the conformal prediction framework. Evaluating several neural network architectures across multiple data sets, we find that the pruned network can maintain, or even improve, the efficiency of the conformal predictors up to a particular pruning ratio, and that this ratio varies with the architecture and data set. These results are instructive for deploying pruned neural networks in real-world applications of conformal predictors, where reliable predictions and reduced computational cost are both relevant, e.g. in healthcare or safety-critical applications. This work is also relevant to further work applying continual learning techniques in the context of conformal predictors.
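A minimal sketch of the pipeline the abstract describes, not the authors' exact method: a small classifier is magnitude-pruned with torch.nn.utils.prune, an inductive conformal predictor is calibrated on held-out data using the common 1 minus softmax-probability nonconformity score, and predictive efficiency is reported as the average prediction-set size. The synthetic data, architecture, pruning ratio, and significance level are illustrative assumptions.

# Sketch: ICP efficiency on a magnitude-pruned network (illustrative setup).
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

torch.manual_seed(0)

# Synthetic 3-class data: proper training, calibration, and test splits.
def make_split(n, d=20, k=3):
    x = torch.randn(n, d)
    y = x[:, :k].argmax(dim=1)  # label = strongest of the first k features
    return x, y

x_tr, y_tr = make_split(2000)
x_cal, y_cal = make_split(500)
x_te, y_te = make_split(500)

# Small MLP briefly trained on the proper training set (architecture is an assumption).
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 3))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    nn.functional.cross_entropy(model(x_tr), y_tr).backward()
    opt.step()

# Unstructured magnitude pruning of every linear layer (50% ratio is illustrative).
for m in model.modules():
    if isinstance(m, nn.Linear):
        prune.l1_unstructured(m, name="weight", amount=0.5)

# ICP with nonconformity score 1 - softmax probability of the (candidate) label.
with torch.no_grad():
    p_cal = torch.softmax(model(x_cal), dim=1)
    alpha_cal = 1.0 - p_cal[torch.arange(len(y_cal)), y_cal]  # calibration scores
    p_te = torch.softmax(model(x_te), dim=1)
    alpha_te = 1.0 - p_te                                     # score for every candidate label

# p-value for each (test point, candidate label) pair.
n_cal = len(alpha_cal)
pvals = ((alpha_cal.view(1, 1, -1) >= alpha_te.unsqueeze(-1)).sum(dim=-1) + 1) / (n_cal + 1)

epsilon = 0.1                # significance level (illustrative)
pred_sets = pvals > epsilon  # label is included iff its p-value exceeds epsilon

coverage = pred_sets[torch.arange(len(y_te)), y_te].float().mean().item()
avg_size = pred_sets.sum(dim=1).float().mean().item()
print(f"empirical coverage: {coverage:.3f}  average prediction-set size: {avg_size:.2f}")

Re-running the sketch with different values of amount would trace average set size against pruning ratio, which is the kind of efficiency comparison the abstract reports.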
Pages: 21