An Accelerated Stochastic Mirror Descent Method

Cited by: 0
Authors
Jiang, Bo-Ou [1 ,2 ]
Yuan, Ya-Xiang [1 ]
Affiliations
[1] Chinese Acad Sci, LSEC, ICMSEC, AMSS, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Dept Math, Beijing 100049, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Large-scale optimization; Variance reduction; Mirror descent; Acceleration; Independent sampling; Importance sampling; THRESHOLDING ALGORITHM; OPTIMIZATION;
DOI
10.1007/s40305-023-00492-2
Chinese Library Classification
C93 [Management]; O22 [Operations Research];
Discipline Classification Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
Driven by large-scale optimization problems arising in machine learning, stochastic optimization methods have developed rapidly, and numerous variants have been built on the vanilla stochastic gradient descent method. However, for most of these algorithms, the convergence rate in the stochastic setting does not simply match that in the deterministic setting. The main goal of this paper is to better understand this gap between deterministic and stochastic optimization. Specifically, we are interested in Nesterov acceleration of gradient-based approaches. In our study, we focus on accelerating the stochastic mirror descent method with an implicit regularization property. Assuming that the objective is smooth and convex or strongly convex, our analysis prescribes the method parameters that ensure fast convergence of the estimation error and satisfactory numerical performance.
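To illustrate the kind of update the abstract refers to, the following is a minimal sketch of plain (unaccelerated) stochastic mirror descent with the negative-entropy mirror map on the probability simplex, which yields the exponentiated-gradient update. This is a generic textbook illustration, not the paper's accelerated method; the least-squares objective, the decaying step size, and the batch size are all assumptions made for the example.

```python
import numpy as np

# Synthetic problem: f(x) = (1/2n) ||A x - b||^2 minimized over the simplex,
# with b generated from a known simplex point x_star plus small noise.
rng = np.random.default_rng(0)
d, n = 5, 200
A = rng.standard_normal((n, d))
x_star = np.array([0.6, 0.1, 0.1, 0.1, 0.1])
b = A @ x_star + 0.01 * rng.standard_normal(n)

def stochastic_grad(x, batch=10):
    """Unbiased mini-batch estimate of the gradient of f at x."""
    idx = rng.integers(0, n, size=batch)
    Ai = A[idx]
    return Ai.T @ (Ai @ x - b[idx]) / batch

x = np.ones(d) / d              # start at the simplex barycenter
for t in range(1, 2001):
    g = stochastic_grad(x)
    eta = 0.5 / np.sqrt(t)      # decaying step size (assumed schedule)
    x = x * np.exp(-eta * g)    # mirror step under negative entropy
    x /= x.sum()                # renormalize: stay on the simplex
```

The mirror step replaces the Euclidean projection of projected gradient descent with a multiplicative update, so the iterates remain strictly positive and normalized by construction.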
Pages: 549-571
Page count: 23