Continuous and Discrete-time Accelerated Stochastic Mirror Descent for Strongly Convex Functions

Cited: 0
Authors
Xu, Pan [1 ]
Wang, Tianhao [2 ]
Gu, Quanquan [1 ]
Affiliations
[1] Univ Calif Los Angeles, Dept Comp Sci, Los Angeles, CA 90095 USA
[2] Univ Sci & Technol China, Sch Math Sci, Hefei, Anhui, Peoples R China
Funding
National Science Foundation (USA);
Keywords
APPROXIMATION ALGORITHMS; COMPOSITE OPTIMIZATION; MINIMIZATION;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
We provide a second-order stochastic differential equation (SDE) that characterizes the continuous-time dynamics of accelerated stochastic mirror descent (ASMD) for strongly convex functions. This SDE plays a central role in designing new discrete-time ASMD algorithms via numerical discretization and in providing neat, Lyapunov-function-based analyses of their convergence rates. Our results suggest that the only existing ASMD algorithm, AC-SA, proposed in Ghadimi & Lan (2012), is just one instance of its kind, and that new instances of ASMD with fewer tuning parameters can be derived. This sheds light on revisiting accelerated stochastic optimization through the lens of SDEs, which can lead to a better understanding of acceleration in stochastic optimization as well as new, simpler algorithms. Numerical experiments on both synthetic and real data support our theory.
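To make the discrete-time side of the abstract concrete, the following is a minimal, illustrative sketch of an AC-SA-style accelerated stochastic mirror descent iteration. It assumes the entropic mirror map on the probability simplex, a strongly convex quadratic objective with additive Gaussian gradient noise, and illustrative step-size and momentum schedules; none of these choices are taken from the paper, and the sketch is not its exact algorithm.

    # Hypothetical sketch of an AC-SA-style accelerated stochastic mirror
    # descent scheme (three sequences: prox, middle point, aggregate).
    # Objective, noise model, and step sizes are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    d = 20
    A = rng.standard_normal((d, d))
    Q = A.T @ A + 0.5 * np.eye(d)   # f(x) = 0.5 x^T Q x is strongly convex
    noise_std = 0.1                 # std. dev. of the stochastic gradient noise

    def stoch_grad(x):
        # Unbiased stochastic gradient of f at x.
        return Q @ x + noise_std * rng.standard_normal(d)

    def mirror_step(x, g, step):
        # Entropic mirror (exponentiated-gradient) step on the simplex.
        y = x * np.exp(-step * g)
        return y / y.sum()

    x = np.full(d, 1.0 / d)         # prox sequence, initialized at the center
    x_ag = x.copy()                 # aggregated ("output") sequence
    for k in range(1, 501):
        alpha = 2.0 / (k + 1)       # momentum weight (illustrative)
        gamma = 0.5 / (k + 1)       # mirror step size (illustrative)
        x_md = (1 - alpha) * x_ag + alpha * x   # middle point
        g = stoch_grad(x_md)
        x = mirror_step(x, g, gamma)            # mirror-descent prox step
        x_ag = (1 - alpha) * x_ag + alpha * x   # averaging step

    print("f(x_ag) =", 0.5 * x_ag @ Q @ x_ag)

The three coupled sequences play the role of the position, momentum, and averaged iterates that a numerical discretization of a second-order dynamics would produce; the particular schedules above are only placeholders for whichever parameters a given discretization prescribes.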
Pages: 10
Related Papers
50 records in total
  • [1] Accelerated Stochastic Mirror Descent: From Continuous-time Dynamics to Discrete-time Algorithms
    Xu, Pan
    Wang, Tianhao
    Gu, Quanquan
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 84, 2018, 84
  • [2] Accelerated Mirror Descent in Continuous and Discrete Time
    Krichene, Walid
    Bayen, Alexandre M.
    Bartlett, Peter L.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [3] Optimal distributed stochastic mirror descent for strongly convex optimization
    Yuan, Deming
    Hong, Yiguang
    Ho, Daniel W. C.
    Jiang, Guoping
    AUTOMATICA, 2018, 90 : 196 - 203
  • [4] Accelerated Distributed Nesterov Gradient Descent for Smooth and Strongly Convex Functions
    Qu, Guannan
    Li, Na
    2016 54TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2016, : 209 - 216
  • [5] Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization
    Le Thi Khanh Hien
    Cuong V. Nguyen
    Huan Xu
    Canyi Lu
    Jiashi Feng
    Journal of Optimization Theory and Applications, 2019, 181 : 541 - 566
  • [6] Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization
    Hien, Le Thi Khanh
    Nguyen, Cuong, V
    Xu, Huan
    Lu, Canyi
    Feng, Jiashi
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2019, 181 (02) : 541 - 566
  • [7] An Accelerated Stochastic Mirror Descent Method
    Jiang, Bo-Ou
    Yuan, Ya-Xiang
    JOURNAL OF THE OPERATIONS RESEARCH SOCIETY OF CHINA, 2024, 12 (03) : 549 - 571
  • [8] Preliminary results on the existence of continuous Lyapunov functions for semicontinuous, stochastic discrete-time systems
    Teel, Andrew R.
    PROCEEDINGS OF THE 48TH IEEE CONFERENCE ON DECISION AND CONTROL, 2009 HELD JOINTLY WITH THE 2009 28TH CHINESE CONTROL CONFERENCE (CDC/CCC 2009), 2009, : 4729 - 4734
  • [9] Persistence of a continuous stochastic process with discrete-time sampling
    Majumdar, SN
    Bray, AJ
    Ehrhardt, GMCA
    PHYSICAL REVIEW E, 2001, 64 (01):
  • [10] ON THE CONVERGENCE OF MIRROR DESCENT BEYOND STOCHASTIC CONVEX PROGRAMMING
    Zhou, Zhengyuan
    Mertikopoulos, Panayotis
    Bambos, Nicholas
    Boyd, Stephen P.
    Glynn, Peter W.
    SIAM JOURNAL ON OPTIMIZATION, 2020, 30 (01) : 687 - 716