Randomized Primal-Dual Proximal Block Coordinate Updates

Cited by: 16
Authors
Gao, Xiang [1 ]
Xu, Yang-Yang [2 ]
Zhang, Shu-Zhong [1 ]
Affiliations
[1] Univ Minnesota, Dept Ind & Syst Engn, Minneapolis, MN USA
[2] Rensselaer Polytech Inst, Dept Math Sci, Troy, NY 12180 USA
Funding
U.S. National Science Foundation;
Keywords
Primal-dual method; Alternating direction method of multipliers (ADMM); Randomized algorithm; Iteration complexity; First-order stochastic approximation;
DOI
10.1007/s40305-018-0232-4
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research];
Discipline Classification Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
In this paper, we propose a randomized primal-dual proximal block coordinate updating framework for a general multi-block convex optimization model with a coupled objective function and linear constraints. Assuming mere convexity, we establish its O(1/t) convergence rate in terms of the objective value and feasibility measure. The framework includes several existing algorithms as special cases, such as a primal-dual method for bilinear saddle-point problems (PD-S), the proximal Jacobian alternating direction method of multipliers (Prox-JADMM), and a randomized variant of the ADMM for multi-block convex optimization. Our analysis recovers and/or strengthens the convergence properties of several existing algorithms. For example, for PD-S our result yields the same order of convergence rate without the previously assumed boundedness condition on the constraint sets, and for Prox-JADMM the new result provides a convergence rate in terms of the objective value and the feasibility violation. It is well known that the original ADMM may fail to converge when the number of blocks exceeds two. Our result shows that if an appropriate randomization procedure is invoked to select the updating blocks, then a sublinear rate of convergence in expectation can be guaranteed for multi-block ADMM, without assuming any strong convexity. The new approach is also extended to problems where only a stochastic approximation of the (sub)gradient of the objective is available, and we establish an O(1/√t) convergence rate of the extended approach for stochastic programming.
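To make the randomized block-coordinate idea described in the abstract concrete, the following minimal Python sketch performs randomized block updates of an augmented Lagrangian, each followed by a multiplier step, on a toy multi-block problem with quadratic block objectives and one set of linear coupling constraints. The toy data, the uniform block sampling, and the damped dual step size are illustrative assumptions made here; they are not the paper's exact update rules or step-size conditions.

    import numpy as np

    rng = np.random.default_rng(0)
    N, n, m = 4, 5, 3      # number of blocks, block dimension, constraint rows
    A = [rng.standard_normal((m, n)) for _ in range(N)]  # coupling matrices A_i
    c = [rng.standard_normal(n) for _ in range(N)]       # f_i(x_i) = 0.5*||x_i - c_i||^2
    b = rng.standard_normal(m)
    rho = 1.0              # augmented Lagrangian penalty parameter

    x = [np.zeros(n) for _ in range(N)]
    lam = np.zeros(m)      # dual multiplier for sum_i A_i x_i = b

    def residual():
        # constraint residual sum_i A_i x_i - b
        return sum(A[j] @ x[j] for j in range(N)) - b

    for t in range(5000):
        i = rng.integers(N)                        # uniformly sampled block
        r_without_i = residual() - A[i] @ x[i]     # sum_{j != i} A_j x_j - b
        # exact minimization of the augmented Lagrangian over block i:
        # (I + rho*A_i^T A_i) x_i = c_i - A_i^T lam - rho*A_i^T r_without_i
        lhs = np.eye(n) + rho * A[i].T @ A[i]
        rhs = c[i] - A[i].T @ lam - rho * A[i].T @ r_without_i
        x[i] = np.linalg.solve(lhs, rhs)
        lam = lam + (rho / N) * residual()         # damped dual step (illustrative choice)

    print("feasibility violation:", np.linalg.norm(residual()))

Because each block objective here is quadratic, the block minimization reduces to a small linear solve; in the general framework a proximal term would be added to the block subproblem, so only a prox-type update would be required per iteration.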
Pages: 205-250
Number of pages: 46