EXACT O(N²) HYPER-PARAMETER OPTIMIZATION FOR GAUSSIAN PROCESS REGRESSION

Cited by: 0
Authors
Xu, Linning [1 ,2 ]
Dai, Yijue [1 ,2 ]
Zhang, Jiawei [1 ,2 ]
Zhang, Ceyao [1 ,2 ]
Yin, Feng [1 ,2 ]
Affiliations
[1] Chinese Univ Hong Kong, Shenzhen 518172, Peoples R China
[2] SRIBD, Shenzhen 518172, Peoples R China
Keywords
Gaussian process; hyper-parameter optimization; ADMM; cross-validation; low complexity;
DOI
Not available
Chinese Library Classification
TM [Electrical engineering]; TN [Electronic and communication technology];
Discipline Classification Code
0808 ; 0809 ;
Abstract
Hyper-parameter optimization remains a core issue in Gaussian process (GP) machine learning. The benchmark approach, maximum likelihood (ML) estimation combined with gradient descent (GD), is impractical for big data because of its O(n³) complexity. Many sophisticated global and local approximation models have been proposed to address this complexity issue. In this paper, we propose two novel and exact GP hyper-parameter training schemes that replace ML with cross-validation (CV) as the fitting criterion and replace GD with a non-linearly constrained alternating direction method of multipliers (ADMM) as the optimization method. The proposed schemes have O(n²) complexity for any covariance matrix without special structure. We conduct experiments on synthetic and real datasets, in which the proposed schemes show excellent performance in terms of convergence, hyper-parameter estimation, and computational time compared with traditional ML-based routines.
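The record does not reproduce the paper's ADMM-based schemes, but the CV fitting criterion it adopts can be illustrated with the standard closed-form leave-one-out (LOO) residuals for GP regression: with K the noisy covariance matrix, the LOO residual is y_i − μ_{−i} = [K⁻¹y]_i / [K⁻¹]_{ii}, so the CV score never requires refitting n separate models. The sketch below is an assumption-laden toy, not the paper's method: a 1-D squared-exponential kernel, fixed signal and noise variances, and a grid search over the lengthscale standing in for the ADMM updates; each probe here still pays an O(n³) matrix inverse, unlike the paper's O(n²) schemes.

```python
import numpy as np

def rbf_kernel(x, lengthscale, signal_var, noise_var):
    # Squared-exponential covariance for 1-D inputs, plus noise on the diagonal.
    d2 = (x[:, None] - x[None, :]) ** 2
    return signal_var * np.exp(-0.5 * d2 / lengthscale ** 2) + noise_var * np.eye(len(x))

def loo_cv_score(x, y, lengthscale, signal_var=1.0, noise_var=0.1):
    # Mean squared LOO residual via the closed-form identity
    # y_i - mu_{-i} = [K^{-1} y]_i / [K^{-1}]_{ii},
    # so the n leave-one-out predictors are read off a single inverse.
    K = rbf_kernel(x, lengthscale, signal_var, noise_var)
    K_inv = np.linalg.inv(K)  # one O(n^3) inverse per probe in this naive sketch
    resid = (K_inv @ y) / np.diag(K_inv)
    return float(np.mean(resid ** 2))

# Toy data: noisy sine observations.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 40)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

# Crude grid search over the lengthscale only, as a stand-in for ADMM.
grid = [0.1, 0.5, 1.0, 2.0]
best_ell = min(grid, key=lambda ell: loo_cv_score(x, y, ell))
```

The point of the identity is that the CV objective is a cheap function of K⁻¹, which is what makes CV competitive with the ML criterion as a fitting target for GP hyper-parameters.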
Pages: 6
Related Papers
50 results in total
  • [1] Hyper-Parameter Initialization for Squared Exponential Kernel-based Gaussian Process Regression
    Ulapane, Nalika
    Thiyagarajan, Karthick
    Kodagoda, Sarath
    PROCEEDINGS OF THE 15TH IEEE CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS (ICIEA 2020), 2020, : 1154 - 1159
  • [2] Random Search for Hyper-Parameter Optimization
    Bergstra, James
    Bengio, Yoshua
    JOURNAL OF MACHINE LEARNING RESEARCH, 2012, 13 : 281 - 305
  • [4] Hyper-parameter Optimization for Latent Spaces
    Veloso, Bruno
    Caroprese, Luciano
    Konig, Matthias
    Teixeira, Sonia
    Manco, Giuseppe
    Hoos, Holger H.
    Gama, Joao
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2021: RESEARCH TRACK, PT III, 2021, 12977 : 249 - 264
  • [5] Federated learning with hyper-parameter optimization
    Kundroo, Majid
    Kim, Taehong
    JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2023, 35 (09)
  • [6] TRANSITIONAL ANNEALED ADAPTIVE SLICE SAMPLING FOR GAUSSIAN PROCESS HYPER-PARAMETER ESTIMATION
    Garbuno-Inigo, A.
    DiazDelaO, F. A.
    Zuev, K. M.
    INTERNATIONAL JOURNAL FOR UNCERTAINTY QUANTIFICATION, 2016, 6 (04) : 341 - 359
  • [7] Flood susceptibility mapping using support vector regression and hyper-parameter optimization
    Salvati, Aryan
    Nia, Alireza Moghaddam
    Salajegheh, Ali
    Ghaderi, Kayvan
    Asl, Dawood Talebpour
    Al-Ansari, Nadhir
    Solaimani, Feridon
    Clague, John J.
    JOURNAL OF FLOOD RISK MANAGEMENT, 2023, 16 (04)
  • [8] Gradient Hyper-parameter Optimization for Manifold Regularization
    Becker, Cassiano O.
    Ferreira, Paulo A. V.
    2013 12TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA 2013), VOL 2, 2013, : 339 - 344
  • [9] Gaussian process hyper-parameter estimation using Parallel Asymptotically Independent Markov Sampling
    Garbuno-Inigo, A.
    DiazDelaO, F. A.
    Zuev, K. M.
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2016, 103 : 367 - 383
  • [10] Bayesian Optimization for Accelerating Hyper-parameter Tuning
    Vu Nguyen
    2019 IEEE SECOND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND KNOWLEDGE ENGINEERING (AIKE), 2019, : 302 - 305