New convergence results for the inexact variable metric forward-backward method

Cited by: 9
Authors
Bonettini, S. [1 ]
Prato, M. [1 ]
Rebegoldi, S. [2 ]
Affiliations
[1] Univ Modena & Reggio Emilia, Dipartimento Sci Fis Informat & Matemat, Emilia Via Campi 213, I-41125 Modena, Italy
[2] Univ Firenze, Dipartimento Ingn Ind, Viale Morgagni 40, I-50143 Florence, Italy
Keywords
Numerical optimization; Inexact forward-backward methods; Nonconvex problems; Kurdyka-Łojasiewicz property; PROXIMAL ALGORITHM; NOISE REMOVAL; NONCONVEX; MINIMIZATION
DOI
10.1016/j.amc.2020.125719
Chinese Library Classification
O29 [Applied Mathematics];
Subject classification code
070104;
Abstract
Forward-backward methods are effective tools for solving a variety of optimization problems in which the objective function is the sum of a smooth, possibly nonconvex, term and a convex, possibly nonsmooth, function. The corresponding iteration is built on two main ingredients: the computation of the gradient of the smooth part and the evaluation of the proximity (or resolvent) operator associated with the convex term. One of the main difficulties, from both the implementation and the theoretical points of view, arises when the proximity operator is computed inexactly. The aim of this paper is to provide new convergence results for forward-backward methods with inexact computation of the proximity operator, under the assumption that the objective function satisfies the Kurdyka-Łojasiewicz property. In particular, we adopt an inexactness criterion that can be implemented in practice while preserving the main theoretical properties of the proximity operator. The main result is a proof that the iterates generated by the forward-backward algorithm in Bonettini et al. (2017) converge to a stationary point. Convergence rate estimates are also provided. To the best of our knowledge, no other inexact forward-backward algorithm has proven convergence in the nonconvex case together with an explicit procedure for inexactly computing the proximity operator. (c) 2020 Elsevier Inc. All rights reserved.
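The two-ingredient iteration described in the abstract (a gradient step on the smooth part followed by a proximity step on the convex part) can be sketched as follows. This is a generic, exact-prox illustration of the forward-backward scheme, not the authors' inexact algorithm; the LASSO-type example problem, the function names, and the constant step size 1/L are illustrative assumptions.

```python
import numpy as np

def forward_backward(grad_f, prox_g, x0, step, n_iter=200):
    """Generic forward-backward (proximal gradient) iteration:
    x_{k+1} = prox_{step * g}(x_k - step * grad_f(x_k))."""
    x = x0
    for _ in range(n_iter):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Illustrative instance: min 0.5 * ||A x - b||^2 + lam * ||x||_1,
# where the proximity operator of lam * ||.||_1 is soft thresholding.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5])
lam = 0.1

grad_f = lambda x: A.T @ (A @ x - b)                          # gradient of the smooth term
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)  # soft thresholding
obj = lambda z: 0.5 * np.linalg.norm(A @ z - b) ** 2 + lam * np.sum(np.abs(z))

L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of grad_f (spectral norm squared)
x = forward_backward(grad_f, prox_g, np.zeros(5), 1.0 / L)
```

In the inexact setting studied by the paper, the `prox_g` evaluation would itself be the output of an inner iterative solver stopped by an implementable accuracy criterion, rather than a closed-form map as here.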
Pages: 21
References
40 items in total
[1]  
[Anonymous], 1998, GRUND MATH WISS
[2]   On the convergence of the proximal algorithm for nonsmooth functions involving analytic features [J].
Attouch, Hedy ;
Bolte, Jerome .
MATHEMATICAL PROGRAMMING, 2009, 116 (1-2) :5-16
[3]   Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods [J].
Attouch, Hedy ;
Bolte, Jerome ;
Svaiter, Benar Fux .
MATHEMATICAL PROGRAMMING, 2013, 137 (1-2) :91-129
[4]   A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications [J].
Bauschke, Heinz H. ;
Bolte, Jerome ;
Teboulle, Marc .
MATHEMATICS OF OPERATIONS RESEARCH, 2017, 42 (02) :330-348
[5]  
Bauschke HH, 2011, CMS BOOKS MATH, P1, DOI 10.1007/978-1-4419-9467-7
[6]   A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems [J].
Beck, Amir ;
Teboulle, Marc .
SIAM JOURNAL ON IMAGING SCIENCES, 2009, 2 (01) :183-202
[7]  
Bertsekas D. P., 1999, NONLINEAR PROGRAMMIN
[8]   Inexact spectral projected gradient methods on convex sets [J].
Birgin, EG ;
Martínez, JM ;
Raydan, M .
IMA JOURNAL OF NUMERICAL ANALYSIS, 2003, 23 (04) :539-559
[9]   The Łojasiewicz inequality for nonsmooth subanalytic functions with applications to subgradient dynamical systems [J].
Bolte, Jerome ;
Daniilidis, Aris ;
Lewis, Adrian .
SIAM JOURNAL ON OPTIMIZATION, 2007, 17 (04) :1205-1223
[10]   Clarke subgradients of stratifiable functions [J].
Bolte, Jerome ;
Daniilidis, Aris ;
Lewis, Adrian ;
Shiota, Masahiro .
SIAM JOURNAL ON OPTIMIZATION, 2007, 18 (02) :556-572