Restart of Accelerated First-Order Methods With Linear Convergence Under a Quadratic Functional Growth Condition

Cited by: 1
Authors
Alamo, Teodoro [1]
Krupa, Pablo [1]
Limon, Daniel [1]
Affiliations
[1] Univ Seville, Dept Syst Engn & Automat, Seville 41092, Spain
Keywords
Accelerated first-order methods (AFOMs); convex optimization; linear convergence; restart schemes; computational complexity; gradient methods; algorithm
DOI
10.1109/TAC.2022.3146054
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Discipline Classification Code
0812
Abstract
Accelerated first-order methods, also called fast gradient methods, are popular optimization methods in the field of convex optimization. However, they are prone to oscillatory behavior that slows their convergence when medium to high accuracy is required. To address this, restart schemes have been proposed in the literature, which seek to improve the practical convergence by suppressing the oscillatory behavior. This article presents a restart scheme for accelerated first-order methods for which we show linear convergence under the satisfaction of a quadratic functional growth condition, thus encompassing a broad class of not necessarily strongly convex optimization problems. Moreover, the worst-case convergence rate is comparable to the one obtained using a (generally nonimplementable) optimal fixed-rate restart strategy. We compare the proposed algorithm with other restart schemes by applying them to a model predictive control case study.
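The restart idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual scheme, but a simple function-value restart for FISTA (an accelerated first-order method): whenever the objective increases, the momentum is reset. The test problem uses a singular quadratic, so the objective is not strongly convex yet satisfies a quadratic growth condition; the function `fista_with_restart`, the problem data `A`, `b`, and the step size `L` are all illustrative choices.

```python
import numpy as np

def fista_with_restart(A, b, x0, L, iters=500):
    """FISTA on f(x) = 0.5 x'Ax - b'x with a function-value restart."""
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    x, y, t = x0.copy(), x0.copy(), 1.0
    f_prev = f(x)
    history = [f_prev]
    for _ in range(iters):
        x_new = y - grad(y) / L                     # gradient step at momentum point
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2     # Nesterov momentum parameter
        y = x_new + ((t - 1) / t_new) * (x_new - x) # extrapolation
        f_new = f(x_new)
        if f_new > f_prev:                          # objective increased: restart
            t_new, y = 1.0, x_new                   # drop the momentum
        x, t, f_prev = x_new, t_new, f_new
        history.append(f_prev)
    return x, history

# Singular A: not strongly convex, but quadratic growth holds on the solution set.
A = np.diag([1.0, 0.01, 0.0])
b = np.array([1.0, 0.01, 0.0])   # b lies in range(A), so a minimizer exists
x0 = np.zeros(3)
L = 1.0                          # Lipschitz constant of the gradient (largest eigenvalue)
x_star, hist = fista_with_restart(A, b, x0, L)
```

The restart test here monitors the objective value; other common heuristics (e.g., gradient-based restarts) check the sign of the inner product between the gradient and the momentum direction instead.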
Pages: 612–619
Page count: 8
Related papers
30 records in total
  • [1] Alamir M., 2013 European Control Conference (ECC), 2013, p. 3621
  • [2] Alamo T., IEEE Conference on Decision and Control (CDC), 2019, p. 3936, DOI 10.1109/CDC40024.2019.9029983
  • [3] Alamo T., 18th European Control Conference (ECC), 2019, p. 1969, DOI 10.23919/ECC.2019.8795831
  • [4] Beck A., MOS-SIAM Series on Optimization, 2017, p. 1, DOI 10.1137/1.9781611974997
  • [5] Beck A., Teboulle M., "A fast iterative shrinkage-thresholding algorithm for linear inverse problems," SIAM Journal on Imaging Sciences, 2009, 2(1): 183–202
  • [6] Boyd S., Parikh N., Chu E., Peleato B., Eckstein J., "Distributed optimization and statistical learning via the alternating direction method of multipliers," Foundations and Trends in Machine Learning, 2010, 3(1): 1–122
  • [7] Brunton S. L., Proctor J. L., Kutz J. N., "Sparse identification of nonlinear dynamics with control (SINDYc)," IFAC-PapersOnLine, 2016, 49(18): 710–715
  • [8] Drusvyatskiy D., Lewis A. S., "Error bounds, quadratic growth, and linear convergence of proximal methods," Mathematics of Operations Research, 2018, 43(3): 919–948
  • [9] Fercoq O., Qu Z., "Adaptive restart of accelerated gradient methods under local quadratic growth condition," IMA Journal of Numerical Analysis, 2019, 39(4): 2069–2095
  • [10] Giselsson P., IEEE Conference on Decision and Control (CDC), 2014, p. 5058, DOI 10.1109/CDC.2014.7040179