STOCHASTIC MULTILEVEL COMPOSITION OPTIMIZATION ALGORITHMS WITH LEVEL-INDEPENDENT CONVERGENCE RATES

Cited: 16
Authors
Balasubramanian, Krishnakumar [1 ]
Ghadimi, Saeed [2 ]
Nguyen, Anthony [3 ]
Affiliations
[1] Univ Calif Davis, Dept Stat, Davis, CA 95616 USA
[2] Univ Waterloo, Dept Management Sci, Waterloo, ON N2L 3G1, Canada
[3] Univ Calif Davis, Dept Math, Davis, CA 95616 USA
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
multilevel stochastic composition; nonconvex optimization; level-independent convergence rate; complexity bounds; APPROXIMATION METHOD; GRADIENT DESCENT;
DOI
10.1137/21M1406222
CLC number
O29 [Applied Mathematics];
Subject classification code
070104;
Abstract
In this paper, we study smooth stochastic multilevel composition optimization problems, where the objective function is a nested composition of T functions. We assume access to noisy evaluations of the functions and their gradients through a stochastic first-order oracle. For solving this class of problems, we propose two algorithms using moving-average stochastic estimates, and we analyze their convergence to an ε-stationary point of the problem. We show that the first algorithm, which is a generalization of [S. Ghadimi, A. Ruszczynski, and M. Wang, SIAM J. Optim., 30 (2020), pp. 960-979] to the T-level case, can achieve a sample complexity of O_T(1/ε^6) by using minibatches of samples in each iteration, where O_T hides constants that depend on T. By modifying this algorithm using linearized stochastic estimates of the function values, we improve the sample complexity to O_T(1/ε^4). This modification not only removes the requirement of having a minibatch of samples in each iteration, but also makes the algorithm parameter-free and easy to implement. To the best of our knowledge, this is the first time that such an online algorithm designed for the (un)constrained multilevel setting obtains the same sample complexity as the smooth single-level setting, under standard assumptions (unbiasedness and boundedness of the second moments) on the stochastic first-order oracle.
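The moving-average idea described in the abstract can be illustrated on a toy two-level instance (T = 2): rather than plugging a single noisy inner-function sample into the chain rule, the iteration tracks a running average of inner-function values and differentiates through that estimate. The sketch below is illustrative only — the quadratic problem, the noise model, and the step-size schedule are hypothetical choices, not the paper's algorithm or analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-level problem: minimize f(g(x)) with f(u) = 0.5 * ||u||^2
# and g(x) = A @ x, each observed only through noisy oracles.
# All constants below (noise scale, step sizes) are hypothetical.
A = np.array([[2.0, 0.0], [0.0, 1.0]])

def noisy_g(x):       # stochastic inner-function oracle
    return A @ x + 0.01 * rng.standard_normal(2)

def noisy_jac_g(x):   # stochastic inner-Jacobian oracle
    return A + 0.01 * rng.standard_normal((2, 2))

def noisy_grad_f(u):  # stochastic outer-gradient oracle
    return u + 0.01 * rng.standard_normal(2)

x = np.array([1.0, -1.0])
u = noisy_g(x)        # moving-average estimate of the inner value g(x)

for k in range(1, 2001):
    tau = 1.0 / np.sqrt(k)                     # averaging weight (hypothetical schedule)
    u = (1 - tau) * u + tau * noisy_g(x)       # track g(x) with a moving average
    grad = noisy_jac_g(x).T @ noisy_grad_f(u)  # chain-rule gradient estimate at u
    x = x - 0.05 * grad                        # plain stochastic gradient step

print(np.linalg.norm(x))  # should be near 0, the minimizer of this toy problem
```

The point of averaging the inner value is that f is evaluated (through its gradient) at a low-variance estimate of g(x) rather than at a single noisy sample, which is what makes the nested bias controllable.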
Pages: 519-544
Number of pages: 26
Related papers
50 records in total
  • [21] THE EFFECT OF PERTURBATIONS ON THE CONVERGENCE-RATES OF OPTIMIZATION ALGORITHMS
    DUNN, JC
    SACHS, E
    APPLIED MATHEMATICS AND OPTIMIZATION, 1983, 10 (02): 143 - 157
  • [22] Gradient algorithms for quadratic optimization with fast convergence rates
    Pronzato, Luc
    Zhigljavsky, Anatoly
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2011, 50 (03) : 597 - 617
  • [24] User-level Differentially Private Stochastic Convex Optimization: Efficient Algorithms with Optimal Rates
    Asi, Hilal
    Liu, Daogao
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238
  • [25] A STOCHASTIC SUBGRADIENT METHOD FOR NONSMOOTH NONCONVEX MULTILEVEL COMPOSITION OPTIMIZATION
    Ruszczynski, Andrzej
    SIAM JOURNAL ON CONTROL AND OPTIMIZATION, 2021, 59 (03) : 2301 - 2320
  • [26] Rates of convergence of adaptive step-size of stochastic approximation algorithms
    Shao, S
    Yip, PPC
    JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS, 2000, 244 (02) : 333 - 347
  • [27] Convergence rates of accelerated proximal gradient algorithms under independent noise
    Sun, Tao
    Barrio, Roberto
    Jiang, Hao
    Cheng, Lizhi
    NUMERICAL ALGORITHMS, 2019, 81 (02) : 631 - 654
  • [29] Convergence rates for distributed stochastic optimization over random networks
    Jakovetic, Dusan
    Bajovic, Dragana
    Sahu, Anit Kumar
    Kar, Soummya
    2018 IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2018: 4238 - 4245
  • [30] New nonasymptotic convergence rates of stochastic proximal point algorithm for stochastic convex optimization
    Patrascu, Andrei
    OPTIMIZATION, 2021, 70 (09) : 1891 - 1919