Retraction-based first-order feasible methods for difference-of-convex programs with smooth inequality and simple geometric constraints

Times cited: 1
Authors
Zhang, Yongle [1 ]
Li, Guoyin [2 ]
Pong, Ting Kei [3 ]
Xu, Shiqi [4 ]
Affiliations
[1] Sichuan Normal Univ, Dept Math, Visual Comp & Virtual Real Key Lab Sichuan Prov, Chengdu, Peoples R China
[2] Univ New South Wales, Dept Appl Math, Sydney, Australia
[3] Hong Kong Polytech Univ, Dept Appl Math, Hong Kong, Peoples R China
[4] Sichuan Normal Univ, Dept Math, Chengdu, Peoples R China
Funding
Australian Research Council; National Natural Science Foundation of China;
Keywords
First-order feasible methods; Retraction; Difference-of-convex optimization; Kurdyka-Lojasiewicz exponents; GLOBAL CONVERGENCE; ERROR-BOUNDS; MINIMIZATION; OPTIMIZATION; ALGORITHM; REGRESSION; NONCONVEX; PROPERTY; RECOVERY; DESCENT;
DOI
10.1007/s10444-022-10002-2
CLC number
O29 [Applied Mathematics];
Discipline code
070104;
Abstract
In this paper, we propose first-order feasible methods for difference-of-convex (DC) programs with smooth inequality and simple geometric constraints. Our strategy for maintaining feasibility of the iterates is based on a "retraction" idea adapted from the manifold optimization literature. When the constraints are convex, we establish global subsequential convergence of the sequence generated by our algorithm under a strict feasibility condition and, when the objective is in addition convex, analyze its convergence rate in terms of the Kurdyka-Lojasiewicz (KL) exponent of the extended objective (i.e., the sum of the objective and the indicator function of the constraint set). We also show that the extended objective of a large class of Euclidean-norm (and, more generally, group-LASSO-penalty) regularized convex optimization problems is a KL function with exponent 1/2; consequently, our algorithm is locally linearly convergent when applied to these problems. We then extend our method to solve DC programs with a single specially structured nonconvex constraint. Finally, we discuss how our algorithms can be applied to two concrete optimization problems, namely, group-structured compressed sensing with Gaussian measurement noise and compressed sensing with Cauchy measurement noise, and illustrate their empirical performance.
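As a rough illustration of the retraction idea described in the abstract, the following Python sketch applies it to a toy DC program: minimize 0.5*||Ax - b||^2 - mu*||x||_2 over the unit ball {x : ||x||^2 <= 1}, whose center serves as a Slater point. This is a minimal sketch under stated assumptions, not the authors' algorithm; the data A, b, mu and the helper retract are hypothetical, and the paper's subproblem construction, step-size rules, and convergence safeguards are omitted.

import numpy as np

# Toy instance (illustrative data): minimize f1(x) - f2(x) subject to g(x) <= 0,
# with f1(x) = 0.5*||A x - b||^2 (smooth convex), f2(x) = mu*||x||_2 (convex),
# and g(x) = ||x||^2 - 1 (smooth convex inequality; x = 0 is a Slater point).
rng = np.random.default_rng(0)
m, n, mu = 20, 10, 0.1
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

def grad_f1(x):
    return A.T @ (A @ x - b)

def subgrad_f2(x):
    nx = np.linalg.norm(x)
    return mu * x / nx if nx > 0 else np.zeros_like(x)

def g(x):
    return x @ x - 1.0

def retract(y, x_slater, tol=1e-10):
    # Pull an infeasible trial point y back into {g <= 0} by bisection on the
    # segment toward a strictly feasible point; g(x_slater) < 0 by assumption,
    # so a feasible point on the segment exists.
    lo, hi = 0.0, 1.0  # z(t) = y + t*(x_slater - y); z(0) = y is infeasible
    while hi - lo > tol:
        t = 0.5 * (lo + hi)
        if g(y + t * (x_slater - y)) <= 0:
            hi = t
        else:
            lo = t
    return y + hi * (x_slater - y)

x = np.zeros(n)           # feasible starting point
x_slater = np.zeros(n)    # strictly feasible point (strict feasibility assumption)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of grad_f1
for _ in range(200):
    y = x - step * (grad_f1(x) - subgrad_f2(x))   # DC step: linearize f2 at x
    x = y if g(y) <= 0 else retract(y, x_slater)  # every iterate remains feasible
print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 - mu * np.linalg.norm(x),
      "g(x):", g(x))

The point mirrored here is that every iterate stays feasible: when the unconstrained DC gradient step leaves the constraint set, the retraction pulls the trial point back along the segment toward a strictly feasible point, whose existence is exactly what the strict feasibility condition guarantees.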
Pages: 40