Regularized Stein Variational Gradient Flow

Times Cited: 0
Authors
He, Ye [1 ]
Balasubramanian, Krishnakumar [2 ]
Sriperumbudur, Bharath K. [3 ]
Lu, Jianfeng [4 ]
Affiliations
[1] Georgia Inst Technol, Sch Math, 686 Cherry St, Atlanta, GA 30332 USA
[2] Univ Calif Davis, Dept Stat, 399 Crocker Lane,1 Shields Ave, Davis, CA 95616 USA
[3] Penn State Univ, Dept Stat, 314 Thomas Bldg, University Pk, PA 16802 USA
[4] Duke Univ, Math Dept, Box 90320,120 Sci Dr, Durham, NC 27708 USA
Keywords
Wasserstein gradient flow; Stein variational gradient descent; Particle-based sampling; Convergence to equilibrium; Mean-field analysis; Reproducing kernel Hilbert space; Regularization; CONVERGENCE; DIFFUSION; KERNELS;
DOI
10.1007/s10208-024-09663-w
CLC Number
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
The Stein variational gradient descent (SVGD) algorithm is a deterministic particle method for sampling. However, a mean-field analysis reveals that the gradient flow corresponding to the SVGD algorithm (i.e., the Stein Variational Gradient Flow) only provides a constant-order approximation to the Wasserstein gradient flow corresponding to the KL-divergence minimization. In this work, we propose the Regularized Stein Variational Gradient Flow, which interpolates between the Stein Variational Gradient Flow and the Wasserstein gradient flow. We establish various theoretical properties of the Regularized Stein Variational Gradient Flow (and its time-discretization), including convergence to equilibrium, existence and uniqueness of weak solutions, and stability of the solutions. We provide preliminary numerical evidence of the improved performance offered by the regularization.
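For context, the abstract builds on the standard SVGD update of Liu and Wang (reference [25] below): each particle moves along a kernel-smoothed score term plus a kernel-gradient repulsion term. The sketch below illustrates that standard update with an RBF kernel; it is an illustrative implementation only, not the paper's regularized flow, and the function name `svgd_step`, the fixed bandwidth, and the step size are choices made here for demonstration.

```python
import numpy as np

def svgd_step(x, grad_log_p, bandwidth=1.0, step_size=0.1):
    """One standard SVGD update (Liu & Wang, 2016) with an RBF kernel.

    x: (n, d) array of particles.
    grad_log_p: callable mapping an (n, d) array to the (n, d) array of
        score evaluations grad log p at each particle.
    """
    n = x.shape[0]
    diffs = x[:, None, :] - x[None, :, :]            # (n, n, d): x_i - x_j
    sq_dists = np.sum(diffs ** 2, axis=-1)           # (n, n) squared distances
    k = np.exp(-sq_dists / (2.0 * bandwidth ** 2))   # RBF kernel matrix k(x_j, x_i)
    # grad_{x_j} k(x_j, x_i) = (x_i - x_j) / h^2 * k(x_j, x_i): repulsion term
    grad_k = diffs / bandwidth ** 2 * k[:, :, None]  # (n, n, d)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (k @ grad_log_p(x) + grad_k.sum(axis=1)) / n
    return x + step_size * phi

# Usage: drive particles toward a standard normal target (score = -x).
rng = np.random.default_rng(0)
particles = rng.normal(loc=5.0, scale=1.0, size=(50, 2))
for _ in range(200):
    particles = svgd_step(particles, lambda z: -z)
```

The regularization studied in the paper modifies the velocity field of the corresponding mean-field flow so that it interpolates toward the Wasserstein gradient flow; the particle update above shows only the unregularized baseline it departs from.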
Pages: 59
Related Papers
50 records in total
  • [21] Quantile Stein Variational Gradient Descent for Batch Bayesian Optimization
    Gong, Chengyue
    Peng, Jian
    Liu, Qiang
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [22] Variational Wasserstein gradient flow
    Fan, Jiaojiao
    Zhang, Qinsheng
    Taghvaei, Amirhossein
    Chen, Yongxin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [23] Neural Operator Variational Inference Based on Regularized Stein Discrepancy for Deep Gaussian Processes
    Xu, Jian
    Du, Shian
    Yang, Junmei
    Ma, Qianli
    Zeng, Delu
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024
  • [24] Unsupervised Anomaly Detection & Diagnosis: A Stein Variational Gradient Descent Approach
    Chen, Zhichao
    Ding, Leilei
    Huang, Jianmin
    Chu, Zhixuan
    Dai, Qingyang
    Wang, Hao
    PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 3783 - 3787
  • [25] Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm
    Liu, Qiang
    Wang, Dilin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [26] Convergence of Stein Variational Gradient Descent under a Weaker Smoothness Condition
    Sun, Lukang
    Karagulyan, Avetik
    Richtarik, Peter
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 206, 2023, 206
  • [27] Multilevel Stein variational gradient descent with applications to Bayesian inverse problems
    Alsup, Terrence
    Venturi, Luca
    Peherstorfer, Benjamin
    MATHEMATICAL AND SCIENTIFIC MACHINE LEARNING, VOL 145, 2021, 145 : 93+
  • [28] SCALING LIMIT OF THE STEIN VARIATIONAL GRADIENT DESCENT: THE MEAN FIELD REGIME
    Lu, Jianfeng
    Lu, Yulong
    Nolen, James
    SIAM JOURNAL ON MATHEMATICAL ANALYSIS, 2019, 51 (02) : 648 - 671
  • [29] A Modified Stein Variational Inference Algorithm with Bayesian and Gradient Descent Techniques
    Zhang, Limin
    Dong, Jing
    Zhang, Junfang
    Yang, Junzi
    SYMMETRY-BASEL, 2022, 14 (6)
  • [30] Nonlinear Stein Variational Gradient Descent for Learning Diversified Mixture Models
    Wang, Dilin
    Liu, Qiang
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97