Convergence of Stein Variational Gradient Descent under a Weaker Smoothness Condition

Cited by: 0
|
Authors
Sun, Lukang [1 ]
Karagulyan, Avetik [1 ]
Richtarik, Peter [1 ]
Affiliations
[1] KAUST, AI Initiative, Thuwal, Saudi Arabia
Source
INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 206 | 2023 / Vol. 206
Keywords
BAYESIAN-INFERENCE; MONTE-CARLO; LANGEVIN; APPROXIMATIONS; SDES;
DOI
N/A
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Stein Variational Gradient Descent (SVGD) is an important alternative to Langevin-type algorithms for sampling from probability distributions of the form π(x) ∝ exp(−V(x)). In the existing theory of Langevin-type algorithms and SVGD, the potential function V is often assumed to be L-smooth. However, this restrictive condition excludes a large class of potential functions, such as polynomials of degree greater than 2. Our paper studies the convergence of the SVGD algorithm in the population limit for distributions with (L₀, L₁)-smooth potentials. This relaxed smoothness assumption was introduced by Zhang et al. (2019a) for the analysis of gradient clipping algorithms. With the help of trajectory-independent auxiliary conditions, we provide a descent lemma establishing that the algorithm decreases the KL divergence at each iteration, and we prove a complexity bound for SVGD in the population limit in terms of the Stein Fisher information.
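The SVGD update described in the abstract moves a set of particles by a kernelized drift term (which pushes particles toward high-probability regions of π) plus a kernel-gradient term (which repels particles from each other). The following is a minimal illustrative sketch, not the paper's implementation; the RBF bandwidth, step size, and Gaussian target are arbitrary choices for demonstration.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    """Pairwise RBF kernel values and their gradients w.r.t. the first argument."""
    diffs = X[:, None, :] - X[None, :, :]      # diffs[j, i] = x_j - x_i, shape (n, n, d)
    sq = np.sum(diffs ** 2, axis=-1)           # squared pairwise distances, (n, n)
    K = np.exp(-sq / (2 * h ** 2))             # K[j, i] = k(x_j, x_i)
    grad_K = -diffs / h ** 2 * K[:, :, None]   # grad_K[j, i] = ∇_{x_j} k(x_j, x_i)
    return K, grad_K

def svgd_step(X, grad_log_pi, step=0.1, h=1.0):
    """One SVGD update: phi(x_i) = (1/n) Σ_j [k(x_j, x_i) ∇log π(x_j) + ∇_{x_j} k(x_j, x_i)]."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    phi = (K.T @ grad_log_pi(X) + grad_K.sum(axis=0)) / n
    return X + step * phi

# Toy target: standard Gaussian, V(x) = ||x||^2 / 2, so ∇log π(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=3.0, size=(200, 2))         # particles initialized far from the target
for _ in range(500):
    X = svgd_step(X, lambda x: -x, step=0.05)
print(X.mean(axis=0))                           # particle mean drifts toward the origin
```

Note that this Gaussian potential is L-smooth, so classical analyses apply; the point of the paper is that the same scheme can be analyzed when V grows faster than quadratically, where only (L₀, L₁)-smoothness holds.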
Pages: 25
Related Papers
50 in total
  • [21] Learning to Draw Samples with Amortized Stein Variational Gradient Descent
    Feng, Yihao
    Wang, Dilin
    Liu, Qiang
    CONFERENCE ON UNCERTAINTY IN ARTIFICIAL INTELLIGENCE (UAI2017), 2017,
  • [22] Density Estimation-Based Stein Variational Gradient Descent
    Kim, Jeongho
    Lee, Byungjoon
    Min, Chohong
    Park, Jaewoo
    Ryu, Keunkwan
    COGNITIVE COMPUTATION, 2025, 17 (01)
  • [23] Quantile Stein Variational Gradient Descent for Batch Bayesian Optimization
    Gong, Chengyue
    Peng, Jian
    Liu, Qiang
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [24] Stein Variational Gradient Descent with Matrix-Valued Kernels
    Wang, Dilin
    Tang, Ziyang
    Bajaj, Chandrajit
    Liu, Qiang
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [25] CONVERGENCE AND STABILITY RESULTS FOR THE PARTICLE SYSTEM IN THE STEIN GRADIENT DESCENT
    Carrillo, Jose A.
    Skrzeczkowski, Jakub
    MATHEMATICS OF COMPUTATION, 2025,
  • [26] Unsupervised Anomaly Detection & Diagnosis: A Stein Variational Gradient Descent Approach
    Chen, Zhichao
    Ding, Leilei
    Huang, Jianmin
    Chu, Zhixuan
    Dai, Qingyang
    Wang, Hao
    PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 3783 - 3787
  • [27] Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm
    Liu, Qiang
    Wang, Dilin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [28] Multilevel Stein variational gradient descent with applications to Bayesian inverse problems
    Alsup, Terrence
    Venturi, Luca
    Peherstorfer, Benjamin
    MATHEMATICAL AND SCIENTIFIC MACHINE LEARNING, VOL 145, 2021, 145 : 93 - +
  • [29] A Modified Stein Variational Inference Algorithm with Bayesian and Gradient Descent Techniques
    Zhang, Limin
    Dong, Jing
    Zhang, Junfang
    Yang, Junzi
    SYMMETRY-BASEL, 2022, 14 (06):
  • [30] SCALING LIMIT OF THE STEIN VARIATIONAL GRADIENT DESCENT: THE MEAN FIELD REGIME
    Lu, Jianfeng
    Lu, Yulong
    Nolen, James
    SIAM JOURNAL ON MATHEMATICAL ANALYSIS, 2019, 51 (02) : 648 - 671