LEARNING GAUSSIAN PROCESSES WITH BAYESIAN POSTERIOR OPTIMIZATION

Cited by: 0
Authors
Chamon, Luiz F. O. [1 ]
Paternain, Santiago [1]
Ribeiro, Alejandro [1 ]
Affiliations
[1] Univ Penn, Elect & Syst Engn, Philadelphia, PA 19104 USA
DOI
10.1109/ieeeconf44664.2019.9048819
CLC Classification
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
Gaussian processes (GPs) are often used as prior distributions in non-parametric Bayesian methods due to their numerical and analytical tractability. GP priors are specified by choosing a covariance function (along with its hyperparameters), a choice that is not only challenging in practice, but also has a profound impact on performance. This issue is typically overcome using hierarchical models, i.e., by learning a distribution over covariance functions/hyperparameters that defines a mixture of GPs. Yet, since choosing priors for hyperparameters can be challenging, maximum likelihood is often used instead to obtain point estimates. This approach, however, involves solving a non-convex optimization problem and is thus prone to overfitting. To address these issues, this work proposes a hybrid Bayesian-optimization solution in which the hyperparameter posterior distribution is obtained not using Bayes' rule, but as the solution of a mathematical program. Explicitly, we obtain the hyperparameter distribution that minimizes a risk measure induced by the GP mixture. Prior knowledge, including properties such as sparsity and maximum entropy, is incorporated through (possibly non-convex) penalties instead of a prior. We prove that despite its infinite dimensionality and potential non-convexity, this problem can be solved exactly using duality and stochastic optimization.
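A minimal sketch of the idea in the abstract, under simplifying assumptions not taken from the paper: the hyperparameter space is reduced to a finite grid of RBF lengthscales, the risk of each candidate GP is taken to be its negative log marginal likelihood on toy data, and the "posterior" is the distribution over candidates that minimizes expected risk plus a (convex) negative-entropy penalty on the simplex. For this particular penalty the program has the closed-form softmax solution used below; the paper instead treats infinite-dimensional distributions and possibly non-convex penalties via duality and stochastic optimization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (purely illustrative).
X = np.linspace(0.0, 5.0, 30)[:, None]
y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.standard_normal(30)

def rbf_kernel(A, B, lengthscale):
    """Squared-exponential covariance k(a, b) = exp(-|a - b|^2 / (2 l^2))."""
    d2 = (A - B.T) ** 2
    return np.exp(-0.5 * d2 / lengthscale**2)

def neg_log_marginal_likelihood(X, y, lengthscale, noise=0.1):
    """-log p(y | X, theta) for a zero-mean GP with RBF covariance."""
    n = len(y)
    K = rbf_kernel(X, X, lengthscale) + noise**2 * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * n * np.log(2 * np.pi)

# A coarse hyperparameter grid stands in for the infinite-dimensional
# distribution over covariance functions treated in the paper.
lengthscales = np.array([0.05, 0.2, 0.5, 1.0, 2.0])
risk = np.array([neg_log_marginal_likelihood(X, y, ls) for ls in lengthscales])

# Hyperparameter "posterior" as a mathematical program:
#   minimize_q  <q, risk> + lam * sum_k q_k log q_k   s.t.  q in simplex.
# With this entropy penalty the minimizer is a softmax of the negated risks.
lam = 1.0
q = np.exp(-(risk - risk.min()) / lam)
q /= q.sum()

for ls, w in zip(lengthscales, q):
    print(f"lengthscale={ls:4.2f}  weight={w:.3f}")
```

Larger `lam` pushes the solution toward the uniform (maximum-entropy) mixture, while `lam -> 0` recovers a maximum-likelihood-style point estimate, which is one way to read the paper's contrast between point estimates and learned hyperparameter distributions.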
Pages: 482-486
Page count: 5