Gradient-free Stein variational gradient descent with kernel approximation

Cited by: 4
Authors
Yan, Liang [1 ,2 ]
Zou, Xiling [1 ]
Affiliations
[1] Southeast Univ, Sch Math, Nanjing 210096, Peoples R China
[2] Nanjing Ctr Appl Math, Nanjing 211135, Peoples R China
Keywords
Stein variational gradient descent; Bayesian inversion; Surrogate modeling; Inference
DOI
10.1016/j.aml.2021.107465
Chinese Library Classification
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
Stein variational gradient descent (SVGD) has been shown to be a powerful general-purpose nonparametric variational inference algorithm. However, standard SVGD requires the gradient of the target density and therefore cannot be used where the gradient is unavailable or too expensive to evaluate. A gradient-free variant (GF-SVGD) has been proposed that substitutes a surrogate for the gradient; however, the major computational cost of evaluating the forward model still prohibits the use of SVGD for inferring complex distributions. In this paper, we address this issue by evaluating the forward model at only a limited number of points and building its approximation from pre-calculated kernels, keeping the computational cost as low as possible. This approximation method is then combined with an adaptation strategy that automatically refines the model by selecting particles at critical locations, increasing precision at low cost. We observe significant computational gains over the original SVGD and GF-SVGD algorithms. (C) 2021 Elsevier Ltd. All rights reserved.
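As background for the abstract, the baseline SVGD update whose gradient term the paper replaces can be sketched as follows. This is a generic illustration of standard SVGD with an RBF kernel; the fixed bandwidth, step size, toy Gaussian target, and all other parameter choices here are my assumptions, not the paper's kernel-approximation method.

```python
import numpy as np

def svgd_step(X, grad_logp, h=1.0, eps=0.1):
    """One update of standard SVGD:
    phi(x) = (1/n) sum_j [ k(x_j, x) grad log p(x_j) + grad_{x_j} k(x_j, x) ]."""
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]          # diff[j, i] = x_j - x_i
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)   # RBF kernel, K[j, i] = k(x_j, x_i)
    # sum over j of grad_{x_j} k(x_j, x_i) = (-2/h) sum_j (x_j - x_i) K[j, i]
    grad_K = (-2.0 / h) * np.sum(K[:, :, None] * diff, axis=0)
    phi = (K @ grad_logp(X) + grad_K) / n         # K is symmetric, so K @ G == K.T @ G
    return X + eps * phi

# Toy target: standard 1-D Gaussian, for which grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, size=(200, 1))            # particles start far from the target
for _ in range(500):
    X = svgd_step(X, lambda x: -x)
# The particle mean and std drift toward the target's 0 and 1.
```

The first term in `phi` pulls particles toward high-density regions of the target, while the kernel-gradient term acts as a repulsive force that keeps the particle set spread out; GF-SVGD's change is to replace `grad_logp` with an importance-weighted surrogate so the exact gradient is never needed.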
Pages: 7
Related papers
50 records in total
  • [31] Accelerating Convergence of Stein Variational Gradient Descent via Deep Unfolding
    Kawamura, Yuya
    Takabe, Satoshi
    IEEE ACCESS, 2024, 12 : 177911 - 177918
  • [32] A Modified Stein Variational Inference Algorithm with Bayesian and Gradient Descent Techniques
    Zhang, Limin
    Dong, Jing
    Zhang, Junfang
    Yang, Junzi
    SYMMETRY-BASEL, 2022, 14 (06)
  • [33] Towards Understanding the Dynamics of Gaussian-Stein Variational Gradient Descent
    Liu, Tianle
    Ghosal, Promit
    Balasubramanian, Krishnakumar
    Pillai, Natesh S.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [34] Nonlinear Stein Variational Gradient Descent for Learning Diversified Mixture Models
    Wang, Dilin
    Liu, Qiang
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [35] SCALING LIMIT OF THE STEIN VARIATIONAL GRADIENT DESCENT: THE MEAN FIELD REGIME
    Lu, Jianfeng
    Lu, Yulong
    Nolen, James
    SIAM JOURNAL ON MATHEMATICAL ANALYSIS, 2019, 51 (02) : 648 - 671
  • [36] Efficient Gradient-Free Variational Inference using Policy Search
    Arenz, Oleg
    Zhong, Mingjun
    Neumann, Gerhard
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [37] Stein Variational Policy Gradient
    Liu, Yang
    Ramachandran, Prajit
    Liu, Qiang
    Peng, Jian
    CONFERENCE ON UNCERTAINTY IN ARTIFICIAL INTELLIGENCE (UAI2017), 2017,
  • [38] Distributed Randomized Gradient-Free Mirror Descent Algorithm for Constrained Optimization
    Yu, Zhan
    Ho, Daniel W. C.
    Yuan, Deming
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2022, 67 (02) : 957 - 964
  • [39] A conjecture on global optimization using gradient-free stochastic approximation
    Maryak, JL
    Chin, DC
    JOINT CONFERENCE ON THE SCIENCE AND TECHNOLOGY OF INTELLIGENT SYSTEMS, 1998, : 441 - 445
  • [40] Federated Generalized Bayesian Learning via Distributed Stein Variational Gradient Descent
    Kassab, Rahif
    Simeone, Osvaldo
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2022, 70 : 2180 - 2192