Nested Expectation Propagation for Gaussian Process Classification with a Multinomial Probit Likelihood

Cited by: 0
Authors
Riihimäki, Jaakko [1]
Jylänki, Pasi [1]
Vehtari, Aki [1]
Affiliations
[1] Aalto Univ, Sch Sci, Dept Biomed Engn & Computat Sci, FI-00076 Aalto, Finland
Funding
Academy of Finland
Keywords
Gaussian process; multiclass classification; multinomial probit; approximate inference; expectation propagation; approximations; inference
DOI
Not available
CLC Number
TP [Automation and Computer Technology]
Discipline Code
0812
Abstract
This paper considers probabilistic multinomial probit classification using Gaussian process (GP) priors. The main challenges in multiclass GP classification are integrating over the non-Gaussian posterior distribution and the growth in the number of unknown latent variables as the number of target classes increases. Expectation propagation (EP) has proven to be a very accurate method for approximate inference, but the existing EP approaches for multinomial probit GP classification rely on numerical quadratures, or on independence assumptions between the latent values associated with different classes, to facilitate the computations. In this paper we propose a novel nested EP approach that does not require numerical quadratures and accurately approximates all between-class posterior dependencies of the latent values, while still scaling linearly in the number of classes. The predictive accuracy of the nested EP approach is compared to Laplace, variational Bayes, and Markov chain Monte Carlo (MCMC) approximations on various benchmark data sets. In the experiments, nested EP was the method most consistent with MCMC sampling, but in terms of classification accuracy the differences between all the methods were small from a practical point of view.
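The multinomial probit likelihood discussed in the abstract can be written as p(y = c | f) = E_{u ~ N(0,1)}[∏_{j≠c} Φ(u + f_c − f_j)], where f holds one latent GP value per class and Φ is the standard normal CDF. The sketch below is a plain Monte Carlo illustration of this quantity, not the paper's nested EP algorithm (which is precisely designed to avoid sampling and quadrature); the function and variable names are mine.

```python
import math
import numpy as np

# Standard normal CDF, vectorized via math.erf (avoids a SciPy dependency).
_phi = np.vectorize(lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0))))

def multinomial_probit_lik(f, c, n_samples=50_000, seed=0):
    """Monte Carlo estimate of the multinomial probit likelihood
    p(y = c | f) = E_{u ~ N(0,1)}[ prod_{j != c} Phi(u + f_c - f_j) ],
    where f holds one latent function value per class."""
    rng = np.random.default_rng(seed)
    f = np.asarray(f, dtype=float)
    u = rng.standard_normal(n_samples)      # draws of the auxiliary variable u
    diffs = f[c] - np.delete(f, c)          # f_c - f_j for every competing class j
    # For each draw of u, take the product of Phi over the other classes.
    return _phi(u[:, None] + diffs[None, :]).prod(axis=1).mean()

# Larger latent values receive higher class probabilities,
# and the probabilities over all classes sum to one.
f = np.array([0.8, -0.3, 0.1])
p = [multinomial_probit_lik(f, c) for c in range(3)]
```

This likelihood arises from the latent-utility view of the probit model: class c wins when f_c + ε_c exceeds f_j + ε_j for all j, with i.i.d. standard normal noise, and u plays the role of ε_c.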
Pages: 75-109 (35 pages)