ACCELERATED FIRST-ORDER PRIMAL-DUAL PROXIMAL METHODS FOR LINEARLY CONSTRAINED COMPOSITE CONVEX PROGRAMMING

Cited by: 65
Author
Xu, Yangyang [1 ]
Affiliation
[1] Univ Alabama, Dept Math, Tuscaloosa, AL 35489 USA
Keywords
acceleration; linearization; first-order method; augmented Lagrangian method (ALM); alternating direction method of multipliers (ADMM); OPTIMIZATION; ALGORITHM; MINIMIZATION;
DOI
10.1137/16M1082305
Chinese Library Classification
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
Motivated by big data applications, first-order methods have become extremely popular in recent years. However, naive gradient methods generally converge slowly; hence, much effort has been made to accelerate various first-order methods. This paper proposes two accelerated methods for solving structured linearly constrained convex programs whose composite convex objective is the sum of a differentiable function and a possibly nondifferentiable one. The first method is the accelerated linearized augmented Lagrangian method (LALM). At each primal-variable update, it allows linearization of both the differentiable function and the augmented term, which yields easy subproblems. Assuming mere convexity, we show that LALM achieves O(1/t) convergence if the parameters are fixed throughout the iterations and can be accelerated to O(1/t^2) if the parameters are adapted, where t is the total number of iterations. The second method is the accelerated linearized alternating direction method of multipliers (LADMM). In addition to composite convexity, it further assumes a two-block structure on the objective. Unlike the classic alternating direction method of multipliers, our method allows linearization of the objective and the augmented term to keep the updates simple. Assuming strong convexity on one block variable, we show that LADMM also enjoys O(1/t^2) convergence with adaptive parameters. This result significantly improves on that in [Goldstein et al., SIAM J. Imaging Sci., 7 (2014), pp. 1588-1623], which requires strong convexity on both block variables and allows no linearization of the objective or augmented term. Numerical experiments are performed on quadratic programming, image denoising, and support vector machines, comparing the proposed accelerated methods to nonaccelerated ones and to existing accelerated methods. The results demonstrate the validity of the acceleration and the superior performance of the proposed methods over existing ones.
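The linearized ALM update sketched in the abstract (linearize the differentiable term and the augmented term, then apply the proximal operator of the nondifferentiable term, followed by a dual ascent step) can be illustrated as follows. This is a minimal nonaccelerated sketch with fixed parameters; the problem instance (l1-regularized least squares with a linear equality constraint), the variable names, and the step-size rule are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def linearized_alm(A, b, c, mu=0.1, beta=1.0, eta=None, iters=500):
    """Hedged sketch of a linearized ALM for
        min_x  0.5*||x - c||^2 + mu*||x||_1   s.t.  A x = b.
    Fixed parameters (the O(1/t) regime); the adaptive O(1/t^2)
    acceleration of the paper is not reproduced here."""
    m, n = A.shape
    if eta is None:
        # illustrative step size: 1 / (L_f + beta * ||A||_2^2), with L_f = 1
        eta = 1.0 / (1.0 + beta * np.linalg.norm(A, 2) ** 2)
    x = np.zeros(n)
    lam = np.zeros(m)
    for _ in range(iters):
        # linearize the smooth term and the augmented term at the current x
        grad = (x - c) + A.T @ (lam + beta * (A @ x - b))
        # prox step on the nondifferentiable l1 term
        x = soft_threshold(x - eta * grad, eta * mu)
        # dual (multiplier) ascent step
        lam = lam + beta * (A @ x - b)
    return x, lam
```

With `A = [[1, 1]]`, `b = [1]`, `c = [1, 1]`, the iterates converge to the symmetric solution `x = [0.5, 0.5]`, and the constraint residual `||Ax - b||` vanishes as the multiplier converges.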
Pages: 1459-1484 (26 pages)
Related References
42 records
[11]  
Gabay D., 1976, Computers & Mathematics with Applications, V2, P17, DOI 10.1016/0898-1221(76)90003-1
[12]  
Gao X., 2015, J MATER CHEM C, P1
[13]  
Gao Xiang, 2016, arXiv:1605.05969
[14]   Accelerated gradient methods for nonconvex nonlinear and stochastic programming [J].
Ghadimi, Saeed ;
Lan, Guanghui .
MATHEMATICAL PROGRAMMING, 2016, 156 (1-2) :59-99
[15]  
Glowinski R., 1975, ESAIM MATH MODEL NUM, V9, P41
[16]   Fast alternating linearization methods for minimizing the sum of two convex functions [J].
Goldfarb, Donald ;
Ma, Shiqian ;
Scheinberg, Katya .
MATHEMATICAL PROGRAMMING, 2013, 141 (1-2) :349-382
[17]   Fast Alternating Direction Optimization Methods [J].
Goldstein, Tom ;
O'Donoghue, Brendan ;
Setzer, Simon ;
Baraniuk, Richard .
SIAM JOURNAL ON IMAGING SCIENCES, 2014, 7 (03) :1588-1623
[18]  
Grant M., 2015, CVX: MATLAB software for disciplined convex programming
[19]  
He B., 2010, Optimization Online, P3
[20]   AN ACCELERATED HPE-TYPE ALGORITHM FOR A CLASS OF COMPOSITE CONVEX-CONCAVE SADDLE-POINT PROBLEMS [J].
He, Yunlong ;
Monteiro, Renato D. C. .
SIAM JOURNAL ON OPTIMIZATION, 2016, 26 (01) :29-56