Performance of noisy Nesterov's accelerated method for strongly convex optimization problems

Cited by: 9
Authors
Mohammadi, Hesameddin [1 ]
Razaviyayn, Meisam [2 ]
Jovanovic, Mihailo R. [1 ]
Affiliations
[1] Univ Southern Calif, Dept Elect & Comp Engn, Los Angeles, CA 90089 USA
[2] Univ Southern Calif, Dept Ind & Syst Engn, Los Angeles, CA 90089 USA
Source
2019 AMERICAN CONTROL CONFERENCE (ACC) | 2019
Funding
National Science Foundation (NSF)
Keywords
Accelerated first-order algorithms; control for optimization; convex optimization; integral quadratic constraints; linear matrix inequalities; Nesterov's method; noise amplification; second-order moments; semidefinite programming; GRADIENT; ALGORITHMS;
DOI
10.23919/acc.2019.8814680
CLC number
TP [Automation technology; computer technology]
Discipline classification code
0812
Abstract
We study the performance of noisy gradient descent and Nesterov's accelerated method for strongly convex objective functions with Lipschitz continuous gradients. The steady-state second-order moment of the error in the iterates is analyzed when the gradient is perturbed by additive white noise with zero mean and identity covariance. For any given condition number κ, we derive explicit upper bounds on noise amplification that depend only on κ and the problem size. We use quadratic objective functions to derive lower bounds and to demonstrate that the upper bounds are tight up to a constant factor. The established upper bound for Nesterov's accelerated method is larger than the upper bound for gradient descent by a factor of √κ. This gap identifies a fundamental tradeoff that comes with acceleration in the presence of stochastic uncertainties in the gradient evaluation.
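The √κ gap described in the abstract can be illustrated with a small simulation (a sketch under assumed parameter choices, not the paper's analysis): both methods are run on a diagonal strongly convex quadratic with additive white gradient noise, using the standard strongly convex settings α = 1/L and β = (√κ − 1)/(√κ + 1); the dimension, eigenvalue spread, and iteration counts below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: estimate the steady-state mean-squared error of noisy
# gradient descent vs. Nesterov's accelerated method on the strongly convex
# quadratic f(x) = 0.5 * x^T A x with A = diag(lam). All parameter values
# here (d, m, L, T, burn) are assumptions chosen for the demonstration.

rng = np.random.default_rng(0)
d, m, L = 50, 1.0, 100.0              # dimension, strong convexity, smoothness
lam = np.linspace(m, L, d)            # eigenvalues of A
kappa = L / m                         # condition number
alpha = 1.0 / L                       # step size
beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)   # Nesterov momentum
T, burn = 100_000, 10_000             # iterations and burn-in

def steady_state_mse(update):
    """Time-average ||x_t||^2 after burn-in as a steady-state variance proxy."""
    x = np.zeros(d)
    x_prev = np.zeros(d)
    acc = 0.0
    for t in range(T):
        x, x_prev = update(x, x_prev)
        if t >= burn:
            acc += x @ x
    return acc / (T - burn)

def gd_step(x, x_prev):
    w = rng.standard_normal(d)        # additive white noise in the gradient
    return x - alpha * (lam * x + w), x

def nesterov_step(x, x_prev):
    y = x + beta * (x - x_prev)       # extrapolation step
    w = rng.standard_normal(d)
    return y - alpha * (lam * y + w), x

mse_gd = steady_state_mse(gd_step)
mse_nag = steady_state_mse(nesterov_step)
print(f"kappa={kappa:.0f}  GD mse={mse_gd:.3f}  "
      f"Nesterov mse={mse_nag:.3f}  ratio={mse_nag / mse_gd:.1f}")
```

With κ = 100, the Nesterov iterates settle at a noticeably larger steady-state variance than gradient descent, consistent with the √κ-order gap the abstract reports.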
Pages: 3426-3431
Number of pages: 6