Functional Central Limit Theorem and Strong Law of Large Numbers for Stochastic Gradient Langevin Dynamics

Cited: 0
Authors
A. Lovas
M. Rásonyi
Affiliations
[1] Alfréd Rényi Institute of Mathematics (Rényi Alfréd Matematikai Kutatóintézet)
[2] Budapest University of Technology and Economics (Budapesti Műszaki és Gazdaságtudományi Egyetem)
[3] Eötvös Loránd University (Eötvös Loránd Tudományegyetem)
Source
Applied Mathematics & Optimization | 2023, Vol. 88
Keywords
Stochastic gradient descent; Online learning; Functional central limit theorem; Mixing; Markov chains in random environments;
DOI
Not available
Abstract
We study the mixing properties of an important optimization algorithm of machine learning: stochastic gradient Langevin dynamics (SGLD) with a fixed step size. The data stream is not assumed to be independent, hence the SGLD is not a Markov chain but merely a Markov chain in a random environment, which complicates the mathematical treatment considerably. We derive a strong law of large numbers and a functional central limit theorem for SGLD.
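For context, the fixed-step SGLD recursion studied here takes the form θ_{k+1} = θ_k − λ ∇f(θ_k, X_k) + √(2λ) ξ_k with i.i.d. standard Gaussian ξ_k, where the data stream (X_k) need not be independent. The sketch below illustrates this recursion; the quadratic loss and the AR(1) data stream are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sgld(grad, data_stream, theta0, step=1e-2, n_steps=1000, rng=None):
    """Stochastic gradient Langevin dynamics with a fixed step size.

    Iterates  theta <- theta - step * grad(theta, X_k) + sqrt(2*step) * N(0, I).
    `data_stream` yields the data points X_k; they may be dependent,
    matching the non-i.i.d. setting of the abstract.
    """
    rng = np.random.default_rng(rng)
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_steps):
        x = next(data_stream)
        noise = rng.standard_normal(theta.shape)
        theta = theta - step * grad(theta, x) + np.sqrt(2 * step) * noise
    return theta

def ar1_stream(phi=0.5, rng=None):
    """Hypothetical dependent data stream: a stationary AR(1) process."""
    rng = np.random.default_rng(rng)
    x = 0.0
    while True:
        x = phi * x + rng.standard_normal()
        yield np.array([x])

# Illustrative quadratic loss f(theta, x) = |theta - x|^2 / 2, so the
# stochastic gradient is simply theta - x.
grad = lambda theta, x: theta - x
theta = sgld(grad, ar1_stream(rng=0), np.zeros(1), step=0.01, n_steps=5000, rng=1)
```

Because the injected Gaussian noise has a fixed variance 2λ, the iterates do not converge to a point but fluctuate around the minimizer; the paper's law of large numbers and functional CLT concern time averages of such trajectories.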