Task-Agnostic Out-of-Distribution Detection Using Kernel Density Estimation

Cited: 1
Authors
Erdil, Ertunc [1 ]
Chaitanya, Krishna [1 ]
Karani, Neerav [1 ]
Konukoglu, Ender [1 ]
Affiliations
[1] Swiss Fed Inst Technol, Comp Vis Lab, Sternwartstr 7, CH-8092 Zurich, Switzerland
Source
UNCERTAINTY FOR SAFE UTILIZATION OF MACHINE LEARNING IN MEDICAL IMAGING, AND PERINATAL IMAGING, PLACENTAL AND PRETERM IMAGE ANALYSIS | 2021 / Vol. 12959
Keywords
Out-of-distribution detection; Kernel density estimation; SHAPE PRIORS;
DOI
10.1007/978-3-030-87735-4_9
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In recent years, researchers have proposed a number of successful methods for out-of-distribution (OOD) detection in deep neural networks (DNNs). So far, the scope of the highly accurate methods has been limited to image-level classification tasks. However, attempts at generally applicable methods beyond classification have not attained similar performance. In this paper, we address this limitation by proposing a simple yet effective task-agnostic OOD detection method. We estimate the probability density functions (pdfs) of intermediate features of a pre-trained DNN by performing kernel density estimation (KDE) on the training dataset. As direct application of KDE to feature maps is hindered by their high dimensionality, we use a set of lower-dimensional marginalized KDE models instead of a single high-dimensional one. At test time, we evaluate the pdfs on a test sample and produce a confidence score that indicates whether the sample is OOD. The use of KDE eliminates the need for simplifying assumptions about the underlying feature pdfs and makes the proposed method task-agnostic. We perform experiments on classification tasks using computer vision benchmark datasets. Additionally, we perform experiments on a medical image segmentation task using brain MRI datasets. The results demonstrate that the proposed method consistently achieves high OOD detection performance in both classification and segmentation tasks and improves the state of the art in almost all cases. Our code is available at https://github.com/eerdil/task_agnostic_ood. A longer version of the paper and supplementary materials can be found as a preprint in [8].
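The core idea described in the abstract can be sketched in a few lines. The snippet below is an illustrative approximation, not the authors' implementation: it fits one 1-D (marginalized) KDE per feature channel on training features, then scores a test sample by its average log-density, with lower scores suggesting the sample is OOD. The pooling of feature maps to one scalar per channel and the averaging rule are simplifying assumptions made here for brevity.

```python
import numpy as np
from scipy.stats import gaussian_kde

def fit_channel_kdes(train_features):
    """Fit one 1-D KDE per feature channel (marginalized KDEs).

    train_features: (n_samples, n_channels) array, e.g. intermediate
    DNN feature maps pooled to one scalar per channel.
    """
    return [gaussian_kde(train_features[:, c])
            for c in range(train_features.shape[1])]

def ood_confidence(kdes, x):
    """Average log-density of a test feature vector x under the
    per-channel KDEs; lower values suggest the sample is OOD."""
    logps = [np.log(kde(x[c])[0] + 1e-12) for c, kde in enumerate(kdes)]
    return float(np.mean(logps))

# Synthetic stand-in for pre-trained-DNN features.
rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, size=(500, 8))   # in-distribution features
kdes = fit_channel_kdes(train)

in_dist = rng.normal(0.0, 1.0, size=8)        # near the training density
ood = rng.normal(6.0, 1.0, size=8)            # far from the training density
print(ood_confidence(kdes, in_dist) > ood_confidence(kdes, ood))  # → True
```

In practice, the per-channel scores would be computed at several intermediate layers of the pre-trained network and combined into a single confidence score; the 1-D marginalization is what keeps KDE tractable despite the high dimensionality of the feature maps.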
Cited
Saved
Pages: 91-101
Number of pages: 11
Related papers
36 records in total
[1]  
Amodei D, 2016, Arxiv, DOI [arXiv:1606.06565, DOI 10.48550/ARXIV.1606.06565]
[2]  
[Anonymous], 2017, P INT C LEARNING REP
[3]   NOVELTY DETECTION AND NEURAL-NETWORK VALIDATION [J].
BISHOP, CM .
IEE PROCEEDINGS-VISION IMAGE AND SIGNAL PROCESSING, 1994, 141 (04) :217-222
[4]  
Chaitanya Krishna, 2020, Adv. Neural Inf. Process. Syst., V33, P12546
[5]   Kernel density estimation and intrinsic alignment for shape priors in level set segmentation [J].
Cremers, Daniel ;
Osher, Stanley J. ;
Soatto, Stefano .
INTERNATIONAL JOURNAL OF COMPUTER VISION, 2006, 69 (03) :335-351
[6]  
Deng J, 2009, PROC CVPR IEEE, P248, DOI 10.1109/CVPRW.2009.5206848
[7]  
DeVries T, 2018, Arxiv, DOI [arXiv:1802.04865, DOI 10.48550/ARXIV.1802.04865, 10.48550/arXiv.1802.04865]
[8]   The autism brain imaging data exchange: towards a large-scale evaluation of the intrinsic brain architecture in autism [J].
Di Martino, A. ;
Yan, C-G ;
Li, Q. ;
Denio, E. ;
Castellanos, F. X. ;
Alaerts, K. ;
Anderson, J. S. ;
Assaf, M. ;
Bookheimer, S. Y. ;
Dapretto, M. ;
Deen, B. ;
Delmonte, S. ;
Dinstein, I. ;
Ertl-Wagner, B. ;
Fair, D. A. ;
Gallagher, L. ;
Kennedy, D. P. ;
Keown, C. L. ;
Keysers, C. ;
Lainhart, J. E. ;
Lord, C. ;
Luna, B. ;
Menon, V. ;
Minshew, N. J. ;
Monk, C. S. ;
Mueller, S. ;
Mueller, R. A. ;
Nebel, M. B. ;
Nigg, J. T. ;
O'Hearn, K. ;
Pelphrey, K. A. ;
Peltier, S. J. ;
Rudie, J. D. ;
Sunaert, S. ;
Thioux, M. ;
Tyszka, J. M. ;
Uddin, L. Q. ;
Verhoeven, J. S. ;
Wenderoth, N. ;
Wiggins, J. L. ;
Mostofsky, S. H. ;
Milham, M. P. .
MOLECULAR PSYCHIATRY, 2014, 19 (06) :659-667
[9]  
Erdil E, 2021, Arxiv, DOI arXiv:2006.10712
[10]   Pseudo-Marginal MCMC Sampling for Image Segmentation Using Nonparametric Shape Priors [J].
Erdil, Ertunc ;
Yildirim, Sinan ;
Tasdizen, Tolga ;
Cetin, Mujdat .
IEEE TRANSACTIONS ON IMAGE PROCESSING, 2019, 28 (11) :5702-5715