On the Rate of Convergence of the Difference-of-Convex Algorithm (DCA)

Citations: 6
Authors
Abbaszadehpeivasti, Hadi [1 ]
de Klerk, Etienne [1 ]
Zamani, Moslem [1 ]
Affiliations
[1] Tilburg Univ, Dept Econometr & Operat Res, Tilburg, Netherlands
Keywords
Convex-concave procedure; Difference-of-convex problems; Performance estimation; Worst-case convergence; Semidefinite programming; WORST-CASE PERFORMANCE; NONCONVEX OPTIMIZATION; PROGRAMMING APPROACH; 1ST-ORDER METHODS; GRADIENT; DECOMPOSITION; MINIMIZATION; COMPLEXITY; SELECTION;
DOI
10.1007/s10957-023-02199-z
Chinese Library Classification
C93 [Management Science]; O22 [Operations Research];
Discipline Classification Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
In this paper, we study the non-asymptotic convergence rate of the difference-of-convex algorithm (DCA), also known as the convex-concave procedure, under two different termination criteria suited to smooth and non-smooth decompositions, respectively. The DCA is a popular algorithm for difference-of-convex (DC) problems and is known to converge to a stationary point of the objective under some assumptions. We derive a worst-case convergence rate of O(1/√N) for the objective gradient norm after N iterations for certain classes of DC problems, without assuming strong convexity in the DC decomposition, and we give an example showing that this rate is tight. We also provide a new convergence rate of O(1/N) for the DCA under the second termination criterion. Moreover, we derive a new linear convergence rate result for the DCA under the assumption of the Polyak-Lojasiewicz inequality. The novel aspect of our analysis is that it employs semidefinite programming performance estimation.
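To illustrate the iteration the abstract refers to (this toy example is ours, not from the paper): for f = g − h with g, h convex, each DCA step linearizes h at the current iterate and minimizes the resulting convex surrogate. Below is a minimal sketch on the one-dimensional problem g(x) = x⁴, h(x) = x², where the surrogate minimization has a closed form; all names here are illustrative.

```python
# Minimal DCA sketch on a toy DC problem (illustrative, not the paper's setup).
# f(x) = g(x) - h(x), with g(x) = x**4 and h(x) = x**2, both convex on R.
# DCA step: x_{k+1} = argmin_x g(x) - h'(x_k) * x, i.e. solve 4 x^3 = 2 x_k,
# which gives the closed form x_{k+1} = (x_k / 2) ** (1/3) (sign-preserving).

def dca(x0, tol=1e-10, max_iter=200):
    x = x0
    for k in range(max_iter):
        mag = (abs(x) / 2.0) ** (1.0 / 3.0)       # cube root of |x_k| / 2
        x_new = mag if x >= 0 else -mag
        if abs(x_new - x) < tol:                   # iterate-gap termination
            return x_new, k + 1
        x = x_new
    return x, max_iter

x_star, iters = dca(1.0)
# x_star approximates 1/sqrt(2), a stationary point of f: f'(x) = 4x^3 - 2x = 0
```

Starting from x0 = 1, the iterates decrease monotonically toward 1/√2 ≈ 0.7071, a stationary point of f, matching the convergence-to-stationarity behavior the abstract describes.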
Pages: 475-496
Page count: 22