Nested Expectation Propagation for Gaussian Process Classification with a Multinomial Probit Likelihood

Cited by: 0
Authors
Riihimäki, Jaakko [1]
Jylänki, Pasi [1]
Vehtari, Aki [1]
Affiliations
[1] Aalto Univ, Sch Sci, Dept Biomed Engn & Computat Sci, FI-00076 Aalto, Finland
Funding
Academy of Finland;
Keywords
Gaussian process; multiclass classification; multinomial probit; approximate inference; expectation propagation; APPROXIMATIONS; INFERENCE;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation and Computer Technology];
Discipline code
0812;
Abstract
This paper considers probabilistic multinomial probit classification using Gaussian process (GP) priors. The challenges with multiclass GP classification are the integration over the non-Gaussian posterior distribution and the growth in the number of unknown latent variables as the number of target classes increases. Expectation propagation (EP) has proven to be a very accurate method for approximate inference, but the existing EP approaches for multinomial probit GP classification rely on numerical quadratures, or on independence assumptions between the latent values associated with different classes, to facilitate the computations. In this paper we propose a novel nested EP approach which requires no numerical quadratures and accurately approximates all between-class posterior dependencies of the latent values, yet still scales linearly in the number of classes. The predictive accuracy of the nested EP approach is compared to Laplace, variational Bayes, and Markov chain Monte Carlo (MCMC) approximations on various benchmark data sets. In the experiments, nested EP was the method most consistent with MCMC sampling, but in terms of classification accuracy the differences between all the methods were small from a practical point of view.
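For context, the multinomial probit likelihood referred to in the abstract is conventionally written as p(y = c | f) = E_u[ prod_{j != c} Phi(u + f_c - f_j) ], where u ~ N(0, 1), f is the vector of latent values (one per class), and Phi is the standard normal CDF. The sketch below estimates this expectation by plain Monte Carlo, purely as an illustration of the quantity being approximated; it is not the paper's nested EP method (which specifically avoids such sampling or quadrature), and the function names are our own.

```python
import math
import random


def std_normal_cdf(x):
    """Standard normal CDF via the error function: Phi(x) = (1 + erf(x / sqrt(2))) / 2."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))


def multinomial_probit_likelihood(f, c, n_samples=20000, seed=0):
    """Monte Carlo estimate of the multinomial probit likelihood
    p(y = c | f) = E_u[ prod_{j != c} Phi(u + f[c] - f[j]) ],  u ~ N(0, 1).

    f : list of latent values, one per class; c : index of the observed class.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        u = rng.gauss(0.0, 1.0)  # one draw of the shared auxiliary variable
        prod = 1.0
        for j, fj in enumerate(f):
            if j != c:
                prod *= std_normal_cdf(u + f[c] - fj)
        total += prod
    return total / n_samples


# The class probabilities sum to one (up to Monte Carlo error), and the
# class with the largest latent value receives the largest probability.
f = [1.0, 0.0, -1.0]
probs = [multinomial_probit_likelihood(f, c) for c in range(len(f))]
```

Existing EP approaches mentioned in the abstract evaluate this one-dimensional expectation with numerical quadrature (e.g. Gauss-Hermite); the nested EP scheme proposed in the paper sidesteps that step entirely.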
Pages: 75-109
Number of pages: 35
Related papers
50 records in total
  • [21] Hierarchical Deep Gaussian Processes Latent Variable Model via Expectation Propagation
    Taubert, Nick
    Giese, Martin A.
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT III, 2021, 12893 : 317 - 329
  • [22] Domain Adaptation for Gaussian Process Classification
    Yang, Kai
    Wan, Wanggen
    Lu, Jie
    2018 INTERNATIONAL CONFERENCE ON AUDIO, LANGUAGE AND IMAGE PROCESSING (ICALIP), 2018, : 226 - 229
  • [23] Approximations for Binary Gaussian Process Classification
    Nickisch, Hannes
    Rasmussen, Carl Edward
    JOURNAL OF MACHINE LEARNING RESEARCH, 2008, 9 : 2035 - 2078
  • [24] Robust Gaussian process regression with G-confluent likelihood
    Lindfors, Martin
    Chen, Tianshi
    Naesseth, Christian A.
    IFAC PAPERSONLINE, 2020, 53 (02): : 401 - 406
  • [25] Alpha divergence minimization in multi-class Gaussian process classification
    Villacampa-Calvo, Carlos
    Hernandez-Lobato, Daniel
    NEUROCOMPUTING, 2020, 378 : 210 - 227
  • [26] Nested polynomial trends for the improvement of Gaussian process-based predictors
    Perrin, G.
    Soize, C.
    Marque-Pucheu, S.
    Garnier, J.
    JOURNAL OF COMPUTATIONAL PHYSICS, 2017, 346 : 389 - 402
  • [27] Gaussian Process Classification Using Posterior Linearization
    Garcia-Fernandez, Angel F.
    Tronarp, Filip
    Sarkka, Simo
    IEEE SIGNAL PROCESSING LETTERS, 2019, 26 (05) : 735 - 739
  • [28] Adversarial vulnerability bounds for Gaussian process classification
    Michael Thomas Smith
    Kathrin Grosse
    Michael Backes
    Mauricio A. Álvarez
    Machine Learning, 2023, 112 : 971 - 1009
  • [29] Scalable Large Margin Gaussian Process Classification
    Wistuba, Martin
    Rawat, Ambrish
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2019, PT II, 2020, 11907 : 501 - 516
  • [30] Gaussian process classification for variable fidelity data
    Klyuchnikov, Nikita
    Burnaev, Evgeny
    NEUROCOMPUTING, 2020, 397 : 345 - 355