Smoothing Accelerated Proximal Gradient Method with Fast Convergence Rate for Nonsmooth Convex Optimization Beyond Differentiability

Times Cited: 1
Authors
Wu, Fan [1 ]
Bian, Wei [1 ,2 ]
Affiliations
[1] Harbin Inst Technol, Sch Math, Harbin 150001, Peoples R China
[2] Harbin Inst Technol, Inst Adv Study Math, Harbin 150001, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Nonsmooth optimization; Smoothing method; Accelerated algorithm with extrapolation; Convergence rate; Sequential convergence; MONOTONE-OPERATORS; WEAK-CONVERGENCE; ALGORITHM; MINIMIZATION;
DOI
10.1007/s10957-023-02176-6
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research];
Discipline Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
We propose a smoothing accelerated proximal gradient (SAPG) method with a fast convergence rate for finding a minimizer of a decomposable nonsmooth convex function over a closed convex set. The proposed algorithm combines the smoothing method with the proximal gradient algorithm using the extrapolation coefficient (k-1)/(k+alpha-1), where alpha > 3. The updating rule for the smoothing parameter mu_k is a carefully designed scheme that guarantees a global convergence rate of o(ln^sigma(k)/k), with sigma in (1/2, 1], on the objective function values. Moreover, we prove that the sequence of iterates converges to an optimal solution of the problem. We then introduce an error term into the SAPG algorithm to obtain an inexact smoothing accelerated proximal gradient algorithm, and we establish the same convergence results as for SAPG under a summability condition on the errors. Finally, numerical experiments demonstrate the effectiveness and efficiency of the proposed algorithm.
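The abstract describes an iteration with three ingredients: an extrapolation step with coefficient (k-1)/(k+alpha-1), a gradient step on a smooth surrogate f_mu of the nonsmooth objective, and a projection onto the feasible set, while mu_k is driven to zero. The following NumPy sketch illustrates that structure on a toy problem, minimizing ||x - b||_1 over the nonnegative orthant with the standard smoothing sqrt(t^2 + mu^2) of |t|. The schedule mu_k = mu0/k, the step size, and all parameter values here are illustrative assumptions, not the paper's exact updating rule.

```python
import numpy as np

def sapg_sketch(b, alpha=4.0, mu0=1.0, iters=500):
    """Illustrative SAPG-style iteration (not the paper's exact scheme).

    Minimizes f(x) = ||x - b||_1 over the nonnegative orthant, replacing
    each |t| by the smooth surrogate sqrt(t**2 + mu**2) with mu -> 0.
    """
    x_prev = np.zeros_like(b)
    x = np.zeros_like(b)
    for k in range(1, iters + 1):
        mu = mu0 / k                                        # illustrative schedule for mu_k
        y = x + (k - 1) / (k + alpha - 1) * (x - x_prev)    # extrapolation step
        grad = (y - b) / np.sqrt((y - b) ** 2 + mu ** 2)    # gradient of the smoothed objective
        step = mu                                           # = 1/L_mu: the smoothed gradient is (1/mu)-Lipschitz
        x_prev, x = x, np.maximum(y - step * grad, 0.0)     # projection onto the nonnegative orthant
    return x

x = sapg_sketch(np.array([1.0, -2.0, 3.0]))
# approaches [1, 0, 3], the nonnegative minimizer of ||x - b||_1
```

The projection plays the role of the proximal step for the indicator of the constraint set; with a general decomposable objective one would replace `np.maximum(..., 0.0)` by the appropriate proximal operator.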
Pages: 539-572
Number of pages: 34
Related Papers
50 records in total
  • [1] Smoothing Accelerated Proximal Gradient Method with Fast Convergence Rate for Nonsmooth Convex Optimization Beyond Differentiability
    Fan Wu
    Wei Bian
    Journal of Optimization Theory and Applications, 2023, 197 : 539 - 572
  • [2] A feasible smoothing accelerated projected gradient method for nonsmooth convex optimization
    Nishioka, Akatsuki
    Kanno, Yoshihiro
    OPERATIONS RESEARCH LETTERS, 2024, 57
  • [3] Fast inertial dynamic algorithm with smoothing method for nonsmooth convex optimization
    Xin Qu
    Wei Bian
    Computational Optimization and Applications, 2022, 83 : 287 - 317
  • [4] Fast inertial dynamic algorithm with smoothing method for nonsmooth convex optimization
    Qu, Xin
    Bian, Wei
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2022, 83 (01) : 287 - 317
  • [5] A SMOOTHING PROXIMAL GRADIENT ALGORITHM FOR NONSMOOTH CONVEX REGRESSION WITH CARDINALITY PENALTY
    Bian, Wei
    Chen, Xiaojun
    SIAM JOURNAL ON NUMERICAL ANALYSIS, 2020, 58 (01) : 858 - 883
  • [6] Fast proximal algorithms for nonsmooth convex optimization
    Ouorou, Adam
    OPERATIONS RESEARCH LETTERS, 2020, 48 (06) : 777 - 783
  • [7] Rate of convergence analysis of a dual fast gradient method for general convex optimization
    Patrascu, Andrei
    Necoara, Ion
    Findeisen, Rolf
    2015 54TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2015, : 3311 - 3316
  • [8] MGPROX: A NONSMOOTH MULTIGRID PROXIMAL GRADIENT METHOD WITH ADAPTIVE RESTRICTION FOR STRONGLY CONVEX OPTIMIZATION
    Ang, Andersen
    de Sterck, Hans
    Vavasis, Stephen
    SIAM JOURNAL ON OPTIMIZATION, 2024, 34 (03) : 2788 - 2820
  • [9] A NOTE ON THE (ACCELERATED) PROXIMAL GRADIENT METHOD FOR COMPOSITE CONVEX OPTIMIZATION
    Li, Qingjing
    Tan, Li
    Guo, Ke
    JOURNAL OF NONLINEAR AND CONVEX ANALYSIS, 2022, 23 (12) : 2847 - 2857
  • [10] A FAST DUAL GRADIENT METHOD FOR SEPARABLE CONVEX OPTIMIZATION VIA SMOOTHING
    Li, Jueyou
    Wu, Zhiyou
    Wu, Changzhi
    Long, Qiang
    Wang, Xiangyu
    Lee, Jae-Myung
    Jung, Kwang-Hyo
    PACIFIC JOURNAL OF OPTIMIZATION, 2016, 12 (02): : 289 - +