A new inexact stochastic recursive gradient descent algorithm with Barzilai-Borwein step size in machine learning

Cited by: 1
Authors
Yang, Yi-ming [1 ]
Wang, Fu-sheng [1 ]
Li, Jin-xiang [1 ]
Qin, Yuan-yuan [1 ]
Affiliations
[1] Taiyuan Normal Univ, Sch Math & Stat, Jinzhong 030619, Shanxi, Peoples R China
Keywords
Stochastic optimization; Stochastic gradient; Variance reduction; BB method;
DOI
10.1007/s11071-022-07987-2
Chinese Library Classification: TH [Machinery and Instrument Industry]
Discipline Classification Code: 0802
Abstract
The inexact SARAH (iSARAH) algorithm, a variance-reduction variant of the SARAH algorithm, has recently surged into prominence for solving large-scale optimization problems in machine learning. The performance of iSARAH depends significantly on the choice of the step-size sequence. In this paper, we develop a new algorithm called iSARAH-BB, which employs the Barzilai-Borwein (BB) method to compute the step size automatically within the SARAH framework. By introducing this adaptive step size into the design of the new algorithm, iSARAH-BB combines the advantages of both the iSARAH and BB methods. Finally, we analyze the convergence rate and the complexity of the new algorithm under the usual assumptions. Numerical experiments on standard datasets indicate that the proposed iSARAH-BB algorithm is robust to the selection of the initial step size and is effective and more competitive than existing algorithms.
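To make the mechanics concrete, the following is a minimal Python sketch of a SARAH-type recursion driven by a BB step size. It is an illustration under stated assumptions, not the authors' implementation: `grad_batch` is a hypothetical user-supplied callable, the subsampled snapshot gradient stands in for the "inexact" full gradient, and the BB rule shown is the SVRG-BB-style update eta_k = ||s_k||^2 / (m * s_k^T y_k); the paper's exact rule and safeguards may differ.

```python
import numpy as np

def isarah_bb(grad_batch, w0, n, m=50, epochs=20, eta0=0.1, snap_size=1024, seed=0):
    """Minimal sketch of a SARAH-style loop with a Barzilai-Borwein (BB) step size.

    grad_batch(w, idx) is an assumed user-supplied callable returning the
    average gradient of the component losses indexed by idx; n is the number
    of components. All defaults here are illustrative, not the paper's.
    """
    rng = np.random.default_rng(seed)
    w, eta = np.asarray(w0, dtype=float).copy(), eta0
    w_snap_prev = g_snap_prev = None
    for _ in range(epochs):
        # "Inexact" outer step: estimate the snapshot gradient on a large
        # subsample rather than the full dataset (the "i" in iSARAH).
        idx = rng.choice(n, size=min(snap_size, n), replace=False)
        g = grad_batch(w, idx)
        # BB step size from successive snapshots (SVRG-BB-style rule):
        # eta = ||s||^2 / (m * <s, y>), s = w_k - w_{k-1}, y = g_k - g_{k-1}.
        if w_snap_prev is not None:
            s, y = w - w_snap_prev, g - g_snap_prev
            sy = float(s @ y)
            if sy > 1e-12:          # guard against non-positive curvature
                eta = float(s @ s) / (m * sy)
        w_snap_prev, g_snap_prev = w.copy(), g.copy()
        v, w_prev = g, w.copy()     # SARAH recursion starts from the snapshot
        w = w - eta * v
        for _ in range(m - 1):
            i = np.array([rng.integers(n)])
            # Recursive SARAH estimator: v_t = grad_i(w_t) - grad_i(w_{t-1}) + v_{t-1}
            v = grad_batch(w, i) - grad_batch(w_prev, i) + v
            w_prev = w.copy()
            w = w - eta * v
    return w
```

The positivity check on s_k^T y_k simply skips the BB update when the curvature estimate is unusable, retaining the previous step size; this is a common safeguard in BB-type stochastic methods.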
Pages: 3575-3586
Number of pages: 12
Related papers
7 items in total
  • [1] A Minibatch Proximal Stochastic Recursive Gradient Algorithm Using a Trust-Region-Like Scheme and Barzilai-Borwein Stepsizes
    Yu, Tengteng
    Liu, Xin-Wei
    Dai, Yu-Hong
    Sun, Jie
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32 (10) : 4627 - 4638
  • [2] Random Barzilai-Borwein step size for mini-batch algorithms
    Yang, Zhuang
    Wang, Cheng
    Zhang, Zhemin
    Li, Jonathan
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2018, 72 : 124 - 135
  • [3] Accelerated stochastic gradient descent with step size selection rules
    Yang, Zhuang
    Wang, Cheng
    Zhang, Zhemin
    Li, Jonathan
    SIGNAL PROCESSING, 2019, 159 : 171 - 186
  • [4] Stochastic Gradient Descent and Its Variants in Machine Learning
    Netrapalli, Praneeth
    JOURNAL OF THE INDIAN INSTITUTE OF SCIENCE, 2019, 99 (02) : 201 - 213
  • [5] SARAH-M: A fast stochastic recursive gradient descent algorithm via momentum
    Yang, Zhuang
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 238
  • [6] A stochastic recursive gradient algorithm integrating momentum and the powerball function with adaptive step sizes
    Qin, Chuandong
    Cai, Zilin
    Guo, Yuhang
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2025
  • [7] Large-scale machine learning with synchronous parallel adaptive stochastic variance reduction gradient descent for high-dimensional blindness detection on spark
    Qin, Chuandong
    Zhang, Yiqing
    Cao, Yu
    JOURNAL OF SUPERCOMPUTING, 2025, 81 (04)