Generalized alternating direction method of multipliers: new theoretical insights and applications

Cited by: 78
Authors
Fang E.X. [1 ]
He B. [2 ]
Liu H. [1 ]
Yuan X. [3 ]
Affiliations
[1] Department of Operations Research and Financial Engineering, Princeton University, Princeton, NJ 08544
[2] International Centre of Management Science and Engineering, and Department of Mathematics, Nanjing University, Nanjing
[3] Department of Mathematics, Hong Kong Baptist University, Kowloon
Funding
National Natural Science Foundation of China;
Keywords
Alternating direction method of multipliers; Convergence rate; Convex optimization; Discriminant analysis; Statistical learning; Variable selection;
DOI
10.1007/s12532-015-0078-2
Abstract
Recently, the alternating direction method of multipliers (ADMM) has received intensive attention from a broad spectrum of areas. The generalized ADMM (GADMM) proposed by Eckstein and Bertsekas is an efficient and simple acceleration scheme for ADMM. In this paper, we take a deeper look at the linearized version of GADMM, in which one of its subproblems is approximated by a linearization strategy. This linearized version is particularly efficient for a number of applications arising from different areas. Theoretically, we show the worst-case $\mathcal{O}(1/k)$ convergence rate measured by the iteration complexity ($k$ represents the iteration counter) in both the ergodic and nonergodic senses for the linearized version of GADMM. Numerically, we demonstrate the efficiency of this linearized version of GADMM on some rather new and core applications in statistical learning. Code packages in Matlab for these applications are also developed. © 2015, Springer-Verlag Berlin Heidelberg and The Mathematical Programming Society.
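The GADMM scheme summarized in the abstract can be illustrated on a small lasso-type problem, min (1/2)‖Dx − b‖² + λ‖z‖₁ subject to x − z = 0. The sketch below is a minimal Python illustration, not the paper's Matlab packages: the function name `gadmm_lasso` and its parameter defaults are hypothetical, and the x-subproblem is solved exactly here rather than via the linearization strategy the paper analyzes. The over-relaxation factor α ∈ (0, 2) is the generalization due to Eckstein and Bertsekas; α = 1 recovers plain ADMM.

```python
import numpy as np

def gadmm_lasso(D, b, lam, alpha=1.5, rho=1.0, iters=300):
    """Generalized ADMM (over-relaxation factor alpha in (0, 2)) for
    min 0.5*||D x - b||^2 + lam*||z||_1  s.t.  x - z = 0."""
    n = D.shape[1]
    z = np.zeros(n)
    u = np.zeros(n)                     # scaled dual variable
    Q = D.T @ D + rho * np.eye(n)       # x-subproblem reduces to a linear solve
    Dtb = D.T @ b
    for _ in range(iters):
        # x-update: minimize the augmented Lagrangian in x
        x = np.linalg.solve(Q, Dtb + rho * (z - u))
        # GADMM relaxation step: blend new x with previous z
        x_rel = alpha * x + (1.0 - alpha) * z
        # z-update: soft-thresholding (proximal operator of the l1 norm)
        v = x_rel + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual update on the relaxed residual
        u = u + x_rel - z
    return z
```

For D = I the lasso solution is component-wise soft-thresholding of b, which gives a quick sanity check on the iteration; for the paper's linearized variant, the linear solve in the x-update would be replaced by a proximal-gradient-style step.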
Pages: 149-187
Page count: 38
References (51 in total)
  • [11] Deng W., Yin W., On the global and linear convergence of the generalized alternating direction method of multipliers, Manuscript, (2012)
  • [12] Eckstein J., Parallel alternating direction multiplier decomposition of convex programs, J. Optim. Theory Appl., 80, 1, pp. 39-62, (1994)
  • [13] Eckstein J., Yao W., Augmented Lagrangian and alternating direction methods for convex optimization: a tutorial and some illustrative computational results, RUTCOR Research Report RRR 32-2012, (2012)
  • [14] Eckstein J., Bertsekas D., On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators, Math. Program., 55, pp. 293-318, (1992)
  • [15] Fan J., Fan Y., High dimensional classification using features annealed independence rules, Ann. Stat., 36, pp. 2605-2637, (2008)
  • [16] Fan J., Li R., Variable selection via nonconcave penalized likelihood and its oracle properties, J. Am. Stat. Assoc., 96, pp. 1348-1360, (2001)
  • [17] Fan J., Feng Y., Tong X., A road to classification in high dimensional space: the regularized optimal affine discriminant, J. R. Stat. Soc. Series B Stat. Methodol., 74, pp. 745-771, (2012)
  • [18] Fan J., Zhang J., Yu K., Vast portfolio selection with gross-exposure constraints, J. Am. Stat. Assoc., 107, pp. 592-606, (2012)
  • [19] Fazel M., Hindi H., Boyd S., A rank minimization heuristic with application to minimum order system approximation, Proc. Am. Control Conf., (2001)
  • [20] Fortin M., Glowinski R., Augmented Lagrangian methods: applications to the numerical solution of boundary value problems, Stud. Math. Appl. 15, (1983)