Adaptive Biased Stochastic Optimization

Cited by: 0
Authors
Yang, Zhuang [1 ]
Affiliation
[1] Soochow Univ, Sch Comp Sci & Technol, Suzhou 215006, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Stochastic processes; Optimization; Radio frequency; Convergence; Machine learning algorithms; Machine learning; Complexity theory; Numerical models; Adaptation models; Support vector machines; Stochastic optimization; biased gradient estimation; convergence analysis; numerical stability; adaptivity; CONJUGATE-GRADIENT METHOD; DESCENT;
DOI
10.1109/TPAMI.2025.3528193
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
This work develops and analyzes a class of adaptive biased stochastic optimization (ABSO) algorithms from the perspective of the GEneralized Adaptive gRadient (GEAR) method, which subsumes Adam, AdaGrad, RMSProp, etc. In particular, two preferred biased stochastic optimization (BSO) algorithms, the biased stochastic variance reduction gradient (BSVRG) algorithm and the stochastic recursive gradient algorithm (SARAH), equipped with GEAR, are first considered in this work, leading to two ABSO algorithms: BSVRG-GEAR and SARAH-GEAR. We present a uniform analysis of ABSO algorithms for minimizing strongly convex (SC) and Polyak-Łojasiewicz (PŁ) composite objective functions. Second, we also use our framework to develop another novel BSO algorithm, adaptive biased stochastic conjugate gradient (coined BSCG-GEAR), which achieves the well-known oracle complexity. Specifically, under mild conditions, we prove that the resulting ABSO algorithms attain a linear convergence rate in both the PŁ and SC cases. Moreover, we show that the complexity of the resulting ABSO algorithms is comparable to that of advanced stochastic gradient-based algorithms. Finally, we demonstrate the empirical superiority and the numerical stability of the resulting ABSO algorithms by conducting numerical experiments on different machine learning applications.
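To make the abstract's recipe concrete: combine a biased recursive gradient estimator (as in SARAH) with an adaptive diagonal step size from the GEAR family (here an AdaGrad-style accumulator). The sketch below is an illustrative reconstruction under those assumptions, not the paper's exact SARAH-GEAR method; the function names and hyperparameters are hypothetical.

```python
import numpy as np

def sarah_adaptive(grad_full, grad_i, w0, n, step=0.1, eps=1e-8,
                   outer=10, inner=50, seed=0):
    """Sketch: SARAH's biased recursive gradient estimator driven by an
    AdaGrad-style diagonal step size (a stand-in for GEAR)."""
    rng = np.random.default_rng(seed)
    w = w0.astype(float).copy()
    accum = np.zeros_like(w)              # running sum of squared estimates
    for _ in range(outer):
        v = grad_full(w)                  # full gradient at the snapshot
        accum += v * v
        w_prev, w = w, w - step * v / (np.sqrt(accum) + eps)
        for _ in range(inner):
            i = rng.integers(n)           # sample one component function
            # recursive (hence biased) SARAH update of the estimate
            v = grad_i(w, i) - grad_i(w_prev, i) + v
            accum += v * v
            w_prev, w = w, w - step * v / (np.sqrt(accum) + eps)
    return w
```

For example, on a least-squares problem with `grad_full(w) = A.T @ (A @ w - b) / n` and `grad_i(w, i) = A[i] * (A[i] @ w - b[i])`, the iterates drive the gradient norm down while the accumulator automatically shrinks the per-coordinate step sizes.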
Pages: 3067-3078
Page count: 12