The Boosted Double-proximal Subgradient Algorithm for nonconvex optimization

Cited by: 0
Authors
Aragón-Artacho, Francisco J. [1]
Pérez-Aros, Pedro [2]
Torregrosa-Belén, David [3]
Affiliations
[1] Univ Alicante, Dept Math, Alicante, Spain
[2] Univ Chile, Ctr Modelamiento Matemat, Dept Ingn Matemat, CNRS IRL2807, Beauchef 851, Santiago, Chile
[3] Univ Chile, Ctr Modelamiento Matemat CNRS IRL2807, Beauchef 851, Santiago, Chile
Keywords
Nonconvex optimization; Difference programming; Kurdyka-Łojasiewicz property; Descent lemma; Minimum sum-of-squares clustering; Heron's problem
DOI
10.1007/s10107-024-02190-0
Chinese Library Classification
TP31 [Computer Software]
Discipline classification codes
081202; 0835
Abstract
In this paper we introduce the Boosted Double-proximal Subgradient Algorithm (BDSA), a novel splitting algorithm designed to address general structured nonsmooth and nonconvex mathematical programs expressed as sums and differences of composite functions. BDSA exploits the combined nature of subgradients from the data and proximal steps, and integrates a linesearch procedure to enhance its performance. While BDSA encompasses existing schemes from the literature, it extends their applicability to more diverse problem classes. We establish the convergence of BDSA under the Kurdyka-Łojasiewicz property and analyze its convergence rate. To evaluate the effectiveness of BDSA, we introduce two novel test functions with an abundance of critical points. Comparative evaluations against existing methods, including algorithms with inertial terms, illustrate BDSA's ability to effectively escape non-optimal critical points. Finally, we demonstrate BDSA on two practical applications: a constrained minimum sum-of-squares clustering problem and a nonconvex generalization of Heron's problem.
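The abstract outlines BDSA's template: a subgradient step that linearizes part of the objective, a proximal step on the remaining nonsmooth term, and a linesearch along the resulting direction that "boosts" the update. The Python sketch below illustrates that template on the model problem min_x f(x) + g(x) - h(x), with f smooth, g proximable, and h admitting computable subgradients. It is a minimal reconstruction from the abstract alone, not the authors' implementation; the function names, the parameters (gamma, lam0, rho, sigma), and the specific Armijo-type acceptance test are illustrative assumptions.

    import numpy as np

    def bdsa_sketch(phi, grad_f, prox_g, subgrad_h, x0, gamma=0.5,
                    lam0=2.0, rho=0.5, sigma=1e-4, max_iter=500, tol=1e-8):
        """Illustrative sketch of a double-proximal subgradient step with an
        Armijo-type 'boosting' linesearch, for min_x f(x) + g(x) - h(x).
        All parameter choices here are assumptions, not the paper's."""
        x = np.asarray(x0, dtype=float).copy()
        for _ in range(max_iter):
            v = subgrad_h(x)                      # linearize the concave part -h
            # Forward (sub)gradient step on f - h, then proximal step on g.
            xhat = prox_g(x - gamma * (grad_f(x) - v), gamma)
            d = xhat - x                          # candidate descent direction
            if np.linalg.norm(d) <= tol:          # approximately critical: stop
                break
            lam = lam0                            # backtrack along d, starting at xhat
            while lam > 1e-12 and phi(xhat + lam * d) > phi(xhat) - sigma * lam**2 * (d @ d):
                lam *= rho                        # shrink until sufficient decrease
            x = xhat + (lam if lam > 1e-12 else 0.0) * d
        return x

    # Toy run: f(x) = 0.5*||x - b||^2, g = ||.||_1 (soft-thresholding prox),
    # h = ||.||_2, giving a simple difference-programming objective.
    b = np.array([2.0, -3.0, 0.5])
    phi = lambda x: 0.5 * np.sum((x - b) ** 2) + np.sum(np.abs(x)) - np.linalg.norm(x)
    grad_f = lambda x: x - b
    prox_g = lambda z, g: np.sign(z) * np.maximum(np.abs(z) - g, 0.0)
    subgrad_h = lambda x: x / np.linalg.norm(x) if np.linalg.norm(x) > 0 else np.zeros_like(x)
    print(bdsa_sketch(phi, grad_f, prox_g, subgrad_h, np.zeros(3)))

The linesearch is what makes the method "boosted" rather than a plain double-proximal subgradient step: after computing the trial point xhat, the iterate is pushed further along d = xhat - x whenever doing so still yields sufficient decrease, a mechanism consistent with the abstract's emphasis on enhanced performance and escaping non-optimal critical points.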
Pages: 47