Subsampled Stochastic Variance-Reduced Gradient Langevin Dynamics

Cited by: 0
Authors:
Zou, Difan [1]
Xu, Pan [1]
Gu, Quanquan [1]
Affiliation:
[1] Univ Calif Los Angeles, Dept Comp Sci, Los Angeles, CA 90095 USA
Source:
UNCERTAINTY IN ARTIFICIAL INTELLIGENCE | 2018
Funding:
U.S. National Science Foundation;
Keywords:
OPTIMIZATION;
DOI:
Not available
Chinese Library Classification:
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes:
081104; 0812; 0835; 1405;
Abstract:
Stochastic variance-reduced gradient Langevin dynamics (SVRG-LD) was recently proposed to improve the performance of stochastic gradient Langevin dynamics (SGLD) by reducing the variance of the stochastic gradient. In this paper, we propose a variant of SVRG-LD, namely SVRG-LD+, which replaces the full gradient in each epoch with a subsampled one. We provide a non-asymptotic analysis of the convergence of SVRG-LD+ in 2-Wasserstein distance, and show that SVRG-LD+ enjoys a lower gradient complexity than SVRG-LD when the sample size is large or the target accuracy requirement is moderate. Our analysis directly implies a sharper convergence rate for SVRG-LD, which improves the existing convergence rate by a factor of κ^{1/6} n^{1/6}, where κ is the condition number of the log-density function and n is the sample size. Experiments on both synthetic and real-world datasets validate our theoretical results.
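The update described in the abstract can be sketched as follows. This is a minimal illustration on a toy Gaussian-mean model, not the paper's implementation: the model, step size `eta`, epoch length, anchor batch size `B`, and mini-batch size `b` are all illustrative choices. The only structural point it shows is the one the abstract names: the epoch anchor gradient is computed on a subsample of size B rather than the full dataset (the "+" in SVRG-LD+).

```python
# A hedged sketch of SVRG-LD+ on a toy model: sampling the posterior of a
# Gaussian mean from n unit-variance observations. All hyperparameters are
# illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: n observations x_i ~ N(theta_true, 1).
n, theta_true = 1000, 2.0
x = rng.normal(theta_true, 1.0, size=n)

def grad_f_i(theta, idx):
    """Per-example gradients of the negative log-density sum_i f_i(theta),
    with f_i(theta) = 0.5 * (theta - x_i)^2 (unit-variance Gaussian)."""
    return theta - x[idx]

def svrg_ld_plus(n_epochs=50, epoch_len=20, B=200, b=10, eta=1e-4):
    """SVRG-LD with a subsampled anchor: the epoch-start gradient is
    estimated from a subsample of size B instead of all n examples."""
    theta = 0.0
    samples = []
    for _ in range(n_epochs):
        # Subsampled anchor gradient at the snapshot (the '+' in SVRG-LD+).
        snapshot = theta
        anchor_idx = rng.choice(n, size=B, replace=False)
        g_anchor = (n / B) * grad_f_i(snapshot, anchor_idx).sum()
        for _ in range(epoch_len):
            # Semi-stochastic (variance-reduced) gradient estimate.
            idx = rng.choice(n, size=b, replace=False)
            g = (n / b) * (grad_f_i(theta, idx)
                           - grad_f_i(snapshot, idx)).sum() + g_anchor
            # Langevin step: gradient step plus injected Gaussian noise.
            theta = theta - eta * g + np.sqrt(2 * eta) * rng.normal()
            samples.append(theta)
    return np.array(samples)

samples = svrg_ld_plus()
# Late samples should concentrate near the posterior mean, i.e. mean(x).
print(samples[-200:].mean())
```

Note the design point: the control variate ∇f_i(θ) − ∇f_i(θ̃) + g_anchor keeps the per-step gradient variance small near the snapshot, while subsampling the anchor (B < n) is what reduces the per-epoch gradient complexity relative to SVRG-LD.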
Pages: 508-518 (11 pages)