Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification

Cited by: 4
Authors
Lee, Ching-pei [1]
Affiliations
[1] Academia Sinica, Taipei, Taiwan
Keywords
Variable metric; Manifold identification; Regularized optimization; Inexact method; Superlinear convergence; Quasi-Newton methods; Partial smoothness; Convergence; Minimization; Algorithms; Complexity; Matrix; Sets
DOI
10.1007/s10107-022-01916-2
Chinese Library Classification
TP31 [Computer software]
Subject classification codes
081202; 0835
Abstract
For regularized optimization that minimizes the sum of a smooth term and a regularizer promoting structured solutions, inexact proximal-Newton-type methods, also called successive quadratic approximation (SQA) methods, are widely used for their superlinear convergence in terms of iterations. However, unlike their counterparts in smooth optimization, they suffer from lengthy running time because even approximate solutions of the regularized subproblems cannot be computed easily, so their empirical time cost is not as impressive. In this work, we first show that for partly smooth regularizers, although general inexact solutions cannot identify the active manifold that makes the objective function smooth, approximate solutions generated by commonly used subproblem solvers will identify this manifold, even with arbitrarily low solution precision. We then utilize this property to propose an improved SQA method, ISQA+, that switches to efficient smooth optimization methods after this manifold is identified. We show that for a wide class of degenerate solutions, ISQA+ possesses superlinear convergence not only in iterations but also in running time, because the cost per iteration is bounded. In particular, our superlinear convergence result holds on problems satisfying a sharpness condition more general than that in the existing literature. We also prove iterate convergence under a sharpness condition for inexact SQA, which is novel for this family of methods, as they can easily violate the classical relative-error condition frequently used in proving convergence under similar conditions. Experiments on real-world problems support that ISQA+ improves running time over some modern solvers for regularized optimization.
Pages: 599-633
Number of pages: 35
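The two-phase idea described in the abstract can be illustrated with a minimal sketch. The code below is not the paper's ISQA+ implementation: it specializes to L1-regularized least squares, uses a plain proximal-gradient step (a diagonal quadratic approximation) as the inexact subproblem solver, detects the active manifold as the support of the iterate, and then finishes with scipy's L-BFGS-B restricted to that support, where the objective is smooth. The function names, settling count, and tolerances are illustrative assumptions.

```python
# Minimal two-phase sketch (assumed setting: L1-regularized least squares),
# NOT the paper's ISQA+ implementation.
#   min_x 0.5*||A x - b||^2 + lam*||x||_1
# Phase 1: inexact proximal steps; the support of x plays the role of the
#          active manifold of the L1 regularizer.
# Phase 2: once the support stops changing, optimize only over those
#          coordinates (where the objective is smooth) with a smooth solver.
import numpy as np
from scipy.optimize import minimize


def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


def two_phase_sketch(A, b, lam, max_iter=200, settle=3):
    _, n = A.shape
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    support, stable = np.zeros(n, dtype=bool), 0
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)   # one inexact proximal step
        new_support = x != 0
        stable = stable + 1 if np.array_equal(new_support, support) else 0
        support = new_support
        if stable >= settle:               # support (manifold) has settled
            break
    idx = np.where(support)[0]
    if idx.size == 0:                      # nothing identified; return as is
        return x
    signs = np.sign(x[idx])
    # On the identified manifold the L1 term equals lam * signs @ z, which is
    # linear (hence smooth); bounds keep each coordinate in its sign orthant.
    def f_and_g(z):
        r = A[:, idx] @ z - b
        return 0.5 * r @ r + lam * signs @ z, A[:, idx].T @ r + lam * signs
    bounds = [(0, None) if s > 0 else (None, 0) for s in signs]
    res = minimize(f_and_g, x[idx], jac=True, method="L-BFGS-B", bounds=bounds)
    x_out = np.zeros(n)
    x_out[idx] = res.x
    return x_out


# Tiny usage example on synthetic data.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = two_phase_sketch(A, b, lam=0.5)
print("identified support size:", np.count_nonzero(x_hat))
```

Fixing the signs through bound constraints is what makes the L1 term differentiable on the identified support; the full method in the abstract instead uses variable-metric quadratic models and general smooth-optimization methods on the identified manifold, so this sketch only mirrors the control flow.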