Kurdyka-Łojasiewicz Exponent via Inf-projection

Cited: 33
Authors
Yu, Peiran [1 ]
Li, Guoyin [2 ]
Pong, Ting Kei [1 ]
Affiliations
[1] The Hong Kong Polytechnic University, Department of Applied Mathematics, Hong Kong, People's Republic of China
[2] University of New South Wales, Department of Applied Mathematics, Sydney, Australia
Funding
Australian Research Council
Keywords
First-order methods; Convergence rate; Kurdyka-Łojasiewicz inequality; Kurdyka-Łojasiewicz exponent; Inf-projection; Error bounds; Descent methods; Optimization; Minimization; Convergence; Algorithms; Regularity; Nonconvex
DOI
10.1007/s10208-021-09528-6
CLC Number
TP301 [Theory, Methods];
Subject Classification Code
081202;
Abstract
The Kurdyka-Łojasiewicz (KL) exponent plays an important role in estimating the convergence rates of many contemporary first-order methods. In particular, a KL exponent of 1/2 for a suitable potential function corresponds to local linear convergence. Nevertheless, the KL exponent is in general extremely hard to estimate. In this paper, we show under mild assumptions that the KL exponent is preserved via inf-projection, a fundamental operation that is ubiquitous when reformulating optimization problems via the lift-and-project approach. By studying how inf-projection acts on the KL exponent, we show that the KL exponent is 1/2 for several important convex optimization models, including some semidefinite-programming-representable functions and some functions involving C²-cone reducible structures, under conditions such as strict complementarity. Our results apply to concrete optimization models such as group fused Lasso and overlapping group Lasso. In addition, for nonconvex models, we show that the KL exponent of many difference-of-convex functions can be derived from that of their natural majorant functions, and that the KL exponent of the Bregman envelope of a function is the same as that of the function itself. Finally, we estimate the KL exponent of the sum of the least squares function and the indicator function of the set of matrices of rank at most k.
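For readers unfamiliar with the terminology, the KL property referred to above is typically stated as follows; this is a standard formulation from the literature with illustrative notation (f, x̄, α, Y), not quoted from the paper itself. A proper closed function f satisfies the KL property at x̄ ∈ dom ∂f with exponent α ∈ [0, 1) if there exist c, ε, ν > 0 such that

\[
  \operatorname{dist}\bigl(0, \partial f(x)\bigr) \;\ge\; c\,\bigl(f(x) - f(\bar{x})\bigr)^{\alpha}
  \quad \text{whenever } \|x - \bar{x}\| \le \varepsilon \text{ and } f(\bar{x}) < f(x) < f(\bar{x}) + \nu.
\]

The case α = 1/2 is the one tied to local linear convergence of first-order methods. The inf-projection of a function F of two blocks of variables is

\[
  f(x) \;=\; \inf_{y \in Y} F(x, y),
\]

which arises whenever a model is reformulated by lifting to a higher-dimensional space and projecting back; the paper's central result is that, under mild assumptions, f inherits the KL exponent of F.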
Pages: 1171-1217
Number of pages: 47