An Accelerated Stochastic Mirror Descent Method

Cited: 0
Authors
Jiang, Bo-Ou [1 ,2 ]
Yuan, Ya-Xiang [1 ]
Affiliations
[1] Chinese Acad Sci, LSEC, ICMSEC, AMSS, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Dept Math, Beijing 100049, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Large-scale optimization; Variance reduction; Mirror descent; Acceleration; Independent sampling; Importance sampling; THRESHOLDING ALGORITHM; OPTIMIZATION;
DOI
10.1007/s40305-023-00492-2
Chinese Library Classification
C93 [Management]; O22 [Operations Research];
Discipline classification codes
070105; 12; 1201; 1202; 120202;
Abstract
Driven by large-scale optimization problems arising in machine learning, the development of stochastic optimization methods has grown rapidly. Numerous methods have been built on the vanilla stochastic gradient descent method. However, for most of these algorithms, the convergence rate in the stochastic setting does not match the corresponding rate in the deterministic setting. Better understanding this gap between deterministic and stochastic optimization is the main goal of this paper. Specifically, we are interested in Nesterov acceleration of gradient-based approaches. Our study focuses on accelerating the stochastic mirror descent method, which has an implicit regularization property. Assuming that the objective is smooth and convex or strongly convex, our analysis prescribes the method parameters that ensure fast convergence of the estimation error and satisfactory numerical performance.
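Accelerated stochastic mirror descent schemes of the kind studied here typically couple three sequences: an extrapolation point, a mirror (prox) step on noisy gradients, and a convex averaging step. The following is a minimal illustrative sketch, not the authors' exact method: it uses the squared Euclidean norm as the distance-generating function (so the mirror step reduces to a gradient step), AC-SA-style step sizes, and an assumed toy least-squares problem with an artificially noisy gradient oracle.

```python
import numpy as np

def accelerated_smd(stoch_grad, x0, n_steps, L):
    """Sketch of Nesterov-accelerated stochastic mirror descent.

    With the squared Euclidean norm as the distance-generating function,
    the mirror (prox) step reduces to an ordinary gradient step.
    """
    x = np.asarray(x0, dtype=float)
    z = x.copy()
    for k in range(n_steps):
        alpha = 2.0 / (k + 2)         # coupling weight, ~ 2/k
        gamma = (k + 1) / (2.0 * L)   # mirror-step size, ~ k/(2L)
        y = (1 - alpha) * x + alpha * z   # extrapolation point
        g = stoch_grad(y)                 # noisy gradient oracle at y
        z = z - gamma * g                 # mirror step on the z-sequence
        x = (1 - alpha) * x + alpha * z   # convex averaging
    return x

# Assumed toy problem: smooth convex least squares with gradient noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
L = np.linalg.norm(A, 2) ** 2             # smoothness constant of f
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2

def stoch_grad(x):
    # Exact gradient plus small additive noise, mimicking a stochastic oracle.
    return A.T @ (A @ x - b) + 1e-4 * rng.standard_normal(x.shape)

x_hat = accelerated_smd(stoch_grad, np.zeros(10), 300, L)
x_star = np.linalg.lstsq(A, b, rcond=None)[0]
```

In the deterministic case this parameter choice recovers the accelerated O(L/k^2) rate; with noisy gradients, the growing mirror-step size must in general be capped to balance bias against noise accumulation, which is the kind of trade-off the paper's parameter prescriptions address.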
Pages: 549-571 (23 pages)
Related Papers (50 entries)
  • [1] Validation analysis of mirror descent stochastic approximation method
    Lan, Guanghui
    Nemirovski, Arkadi
    Shapiro, Alexander
    Mathematical Programming, 2012, 134 (2): 425-458
  • [2] Algorithms of Robust Stochastic Optimization Based on Mirror Descent Method
    Nazin, A. V.
    Nemirovsky, A. S.
    Tsybakov, A. B.
    Juditsky, A. B.
    Automation and Remote Control, 2019, 80 (9): 1607-1627
  • [3] Continuous and Discrete-time Accelerated Stochastic Mirror Descent for Strongly Convex Functions
    Xu, Pan
    Wang, Tianhao
    Gu, Quanquan
    International Conference on Machine Learning, Vol. 80, 2018
  • [4] Momentum-based accelerated mirror descent stochastic approximation for robust topology optimization under stochastic loads
    Li, Weichen
    Zhang, Xiaojia Shelly
    International Journal for Numerical Methods in Engineering, 2021, 122 (17): 4431-4457
  • [5] Primal–Dual Mirror Descent Method for Constraint Stochastic Optimization Problems
    Bayandina, A. S.
    Gasnikov, A. V.
    Gasnikova, E. V.
    Matsievskii, S. V.
    Computational Mathematics and Mathematical Physics, 2018, 58: 1728-1736
  • [6] Stochastic mirror descent method for distributed multi-agent optimization
    Li, Jueyou
    Li, Guoquan
    Wu, Zhiyou
    Wu, Changzhi
    Optimization Letters, 2018, 12 (6): 1179-1197
  • [7] Distributed mirror descent method with operator extrapolation for stochastic aggregative games
    Wang, Tongyu
    Yi, Peng
    Chen, Jie
    Automatica, 2024, 159