Nested Expectation Propagation for Gaussian Process Classification with a Multinomial Probit Likelihood

Cited by: 0
Authors
Riihimaki, Jaakko [1 ]
Jylanki, Pasi [1 ]
Vehtari, Aki [1 ]
Affiliation
[1] Aalto Univ, Sch Sci, Dept Biomed Engn & Computat Sci, FI-00076 Aalto, Finland
Funding
Academy of Finland;
Keywords
Gaussian process; multiclass classification; multinomial probit; approximate inference; expectation propagation; APPROXIMATIONS; INFERENCE;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
This paper considers probabilistic multinomial probit classification using Gaussian process (GP) priors. The challenges with multiclass GP classification are the integration over the non-Gaussian posterior distribution and the growth in the number of unknown latent variables as the number of target classes increases. Expectation propagation (EP) has proven to be a very accurate method for approximate inference, but existing EP approaches for multinomial probit GP classification rely on numerical quadratures, or on independence assumptions between the latent values associated with different classes, to facilitate the computations. In this paper we propose a novel nested EP approach that does not require numerical quadratures and accurately approximates all between-class posterior dependencies of the latent values, yet still scales linearly in the number of classes. The predictive accuracy of the nested EP approach is compared with Laplace, variational Bayes, and Markov chain Monte Carlo (MCMC) approximations on various benchmark data sets. In the experiments, nested EP was the method most consistent with MCMC sampling, but in terms of classification accuracy the differences between all the methods were small from a practical point of view.
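Note: the multinomial probit likelihood referred to in the abstract is conventionally written in the form used in the GP classification literature (e.g. Girolami and Rogers, 2006); the notation below is a sketch under that assumption, not taken verbatim from the paper:

\[
  p(y_i = c \mid \mathbf{f}_i)
    \;=\; \mathbb{E}_{u \sim \mathcal{N}(0,1)}
      \Bigl[ \prod_{j \neq c} \Phi\bigl(u + f_i^{c} - f_i^{j}\bigr) \Bigr],
\]

where f_i^c denotes the latent function value associated with class c at input x_i and \Phi is the standard normal cumulative distribution function. The one-dimensional expectation over the auxiliary variable u is the quantity that earlier EP approaches handle with numerical quadrature or independence assumptions, and that the proposed nested EP scheme approximates without them.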
Pages: 75-109
Number of pages: 35