Retraction-based first-order feasible methods for difference-of-convex programs with smooth inequality and simple geometric constraints
Cited by: 1
Authors:
Zhang, Yongle [1]; Li, Guoyin [2]; Pong, Ting Kei [3]; Xu, Shiqi [4]
Affiliations:
[1] Sichuan Normal Univ, Dept Math, Visual Comp & Virtual Real Key Lab Sichuan Prov, Chengdu, Peoples R China
[2] Univ New South Wales, Dept Appl Math, Sydney, Australia
[3] Hong Kong Polytech Univ, Dept Appl Math, Hong Kong, Peoples R China
[4] Sichuan Normal Univ, Dept Math, Chengdu, Peoples R China
Abstract:
In this paper, we propose first-order feasible methods for difference-of-convex (DC) programs with smooth inequality and simple geometric constraints. Our strategy for maintaining feasibility of the iterates is based on a "retraction" idea adapted from the manifold optimization literature. When the constraints are convex, we establish the global subsequential convergence of the sequence generated by our algorithm under a strict feasibility condition and, when the objective is in addition convex, we analyze its convergence rate in terms of the Kurdyka-Lojasiewicz (KL) exponent of the extended objective (i.e., the sum of the objective and the indicator function of the constraint set). We also show that the extended objective of a large class of Euclidean norm (and, more generally, group LASSO penalty) regularized convex optimization problems is a KL function with exponent 1/2; consequently, our algorithm is locally linearly convergent when applied to these problems. We then extend our method to solve DC programs with a single specially structured nonconvex constraint. Finally, we discuss how our algorithms can be applied to two concrete optimization problems, namely, group-structured compressed sensing with Gaussian measurement noise and compressed sensing with Cauchy measurement noise, and illustrate the empirical performance of our algorithms.
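To make the feasibility-preserving idea concrete, below is a minimal Python sketch of one retraction-style step, not the authors' exact scheme: the "retraction" here is an assumed backtracking move toward a strictly feasible anchor point (so every iterate satisfies g(x) <= 0), and the L1-L2 regularized least-squares demo, the Euclidean-ball constraint, the subgradient choices, and the fixed step size are all illustrative assumptions rather than details from the paper.

```python
import numpy as np

def feasible_dc_step(x, grad_P1, subgrad_P2, g, x_anchor, step):
    """One sketch step for  min P1(x) - P2(x)  s.t.  g(x) <= 0.

    Takes a DC (sub)gradient move, then retracts the tentative point back
    into the feasible set by shrinking toward a strictly feasible anchor
    (g(x_anchor) < 0). This backtracking retraction is an assumption for
    illustration; the paper's retraction may differ.
    """
    y = x - step * (grad_P1(x) - subgrad_P2(x))   # tentative, possibly infeasible
    t = 1.0
    while g(x_anchor + t * (y - x_anchor)) > 0:   # backtrack until feasible
        t *= 0.5
    return x_anchor + t * (y - x_anchor)

# Tiny demo: least squares with the DC regularizer lam*(||x||_1 - ||x||_2),
# a common compressed-sensing surrogate, over the ball ||x|| <= r.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 50)), rng.standard_normal(20)
lam, r = 0.1, 5.0

def grad_P1(x):     # a subgradient of P1(x) = 0.5*||Ax - b||^2 + lam*||x||_1
    return A.T @ (A @ x - b) + lam * np.sign(x)

def subgrad_P2(x):  # a subgradient of P2(x) = lam*||x||_2
    n = np.linalg.norm(x)
    return lam * x / n if n > 0 else np.zeros_like(x)

g = lambda x: x @ x - r ** 2      # smooth convex constraint g(x) <= 0
anchor = np.zeros(50)             # strictly feasible: g(0) = -r^2 < 0
x = anchor.copy()
for _ in range(200):
    x = feasible_dc_step(x, grad_P1, subgrad_P2, g, anchor, step=0.01)

print("g(x) =", g(x))             # nonpositive: every iterate stays feasible
```

Because the anchor is strictly feasible and g is continuous, the backtracking loop always terminates, which mirrors the role the strict feasibility condition plays in the convergence analysis summarized in the abstract.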