Separable approximations and decomposition methods for the augmented Lagrangian

Cited by: 14
Authors
Tappenden, Rachael [1]
Richtárik, Peter [1]
Büke, Burak [1]
Affiliation
[1] Univ Edinburgh, Sch Math, JCMB, Edinburgh EH9 3JZ, Midlothian, Scotland
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK;
Keywords
augmented Lagrangian; separable approximations; (block) coordinate descent methods; iteration complexity; OPTIMIZATION; CONVERGENCE; ALGORITHM; HESTENES;
DOI
10.1080/10556788.2014.966824
Chinese Library Classification
TP31 [Computer software];
Discipline codes
081202; 0835;
Abstract
In this paper, we study decomposition methods based on separable approximations for minimizing the augmented Lagrangian. In particular, we study and compare the diagonal quadratic approximation method (DQAM) of Mulvey and Ruszczyński [A diagonal quadratic approximation method for large scale linear programs, Oper. Res. Lett. 12 (1992), pp. 205-215] and the parallel coordinate descent method (PCDM) of Richtárik and Takáč [Parallel coordinate descent methods for big data optimization. Technical report, November 2012. arXiv:1212.0873]. We show that the two methods are equivalent for feasibility problems up to the selection of a step-size parameter. Furthermore, we prove an improved complexity bound for PCDM under strong convexity, and show that this bound is at least 8(L'/L̄)(ω−1)² times better than the best known bound for DQAM, where ω is the degree of partial separability and L' and L̄ are the maximum and average of the block Lipschitz constants of the gradient of the quadratic penalty appearing in the augmented Lagrangian.
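The improvement factor quoted in the abstract is easy to evaluate numerically. The minimal sketch below assumes the reconstructed form 8(L'/L̄)(ω−1)², with L' the maximum and L̄ the average of the block Lipschitz constants and ω the degree of partial separability; the function name and the sample inputs are illustrative, not from the paper.

```python
def pcdm_over_dqam_factor(L_max: float, L_avg: float, omega: int) -> float:
    """Evaluate 8 * (L'/Lbar) * (omega - 1)**2, the factor by which the
    PCDM complexity bound improves on the best known DQAM bound."""
    return 8.0 * (L_max / L_avg) * (omega - 1) ** 2

# Illustrative values: omega = 4, L' = 2, Lbar = 1.
print(pcdm_over_dqam_factor(2.0, 1.0, 4))  # 8 * 2 * 9 = 144.0
```

Note that the factor grows quadratically in ω, so the predicted advantage of PCDM is largest for problems that are far from fully separable.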
Pages: 643-668
Page count: 26