Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization

Cited: 15
Authors
Xu, Yangyang [1 ]
Zhang, Shuzhong [2 ]
Affiliations
[1] Rensselaer Polytech Inst, Dept Math Sci, Troy, NY 12180 USA
[2] Univ Minnesota, Dept Ind & Syst Engn, Minneapolis, MN USA
Keywords
Primal-dual method; Block coordinate update; Alternating direction method of multipliers (ADMM); Accelerated first-order method; ALTERNATING DIRECTION METHOD; LINEAR CONVERGENCE; DECOMPOSITION; ALGORITHMS; ADMM;
DOI
10.1007/s10589-017-9972-z
Chinese Library Classification
C93 [Management]; O22 [Operations Research];
Subject Classification Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
Block coordinate update (BCU) methods enjoy low per-update computational complexity because each update involves only one or a few block variables out of a possibly large number of blocks. They are also easily parallelized and have therefore become particularly popular for solving problems with large-scale datasets and/or many variables. In this paper, we propose a primal-dual BCU method for solving linearly constrained convex programs with multi-block variables. The method is an accelerated version of a primal-dual algorithm proposed by the authors, which applies randomization in selecting the block variables to update and establishes an O(1/t) convergence rate under a convexity assumption. We show that the rate can be accelerated to O(1/t^2) if the objective is strongly convex. In addition, if one block variable is independent of the others in the objective, we show that the algorithm can be modified to achieve a linear rate of convergence. The numerical experiments show that the accelerated method performs stably with a single set of parameters, whereas the original method needs to tune the parameters for different datasets in order to achieve a comparable level of performance.
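To make the problem class and the flavor of a randomized primal-dual block coordinate update concrete, the sketch below is a minimal Python illustration, not the paper's exact (accelerated) algorithm. It assumes a toy model problem min_x sum_i 0.5*||x_i||^2 subject to sum_i A_i x_i = b, and all names and step-size choices (A, rho, sigma, tau) are assumptions made only for this example.

```python
# Minimal illustrative sketch (NOT the paper's exact accelerated method):
# randomized primal-dual block coordinate update for the toy problem
#     minimize  sum_i 0.5*||x_i||^2   subject to  sum_i A_i x_i = b.
# Step-size choices (rho, sigma, tau) are conservative assumptions for this example.
import numpy as np

rng = np.random.default_rng(0)
m, n_blocks, block_dim = 20, 5, 10
A = [rng.standard_normal((m, block_dim)) for _ in range(n_blocks)]
b = rng.standard_normal(m)

x = [np.zeros(block_dim) for _ in range(n_blocks)]
lam = np.zeros(m)                      # dual variable for the linear constraint
rho = 1.0                              # augmented-Lagrangian penalty (illustrative)
sigma = rho / n_blocks                 # conservative dual step size (illustrative)
# per-block primal step: 1 / (Lipschitz constant of the block gradient)
tau = [1.0 / (1.0 + rho * np.linalg.norm(A[i], 2) ** 2) for i in range(n_blocks)]

def residual():
    """Constraint residual  sum_i A_i x_i - b."""
    return sum(A[i] @ x[i] for i in range(n_blocks)) - b

for t in range(5000):
    i = rng.integers(n_blocks)                          # randomly pick one block to update
    grad_i = x[i] + A[i].T @ (lam + rho * residual())   # block gradient of the aug. Lagrangian
    x[i] = x[i] - tau[i] * grad_i                       # primal block (gradient) step
    lam = lam + sigma * residual()                      # dual ascent step
    if t % 1000 == 0:
        print(f"iter {t:5d}  ||Ax - b|| = {np.linalg.norm(residual()):.2e}")
```

The fixed step sizes above are only meant to show the structure of one randomly selected primal block update followed by a dual update; in the paper's setting the acceleration comes from adapting the algorithm's parameters over the iterations when the objective is strongly convex.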
Pages: 91-128
Number of pages: 38
Related Papers
50 records in total
  • [31] A Decentralized Primal-Dual Framework for Non-Convex Smooth Consensus Optimization
    Mancino-Ball, Gabriel
    Xu, Yangyang
    Chen, Jie
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2023, 71 : 525 - 538
  • [32] Parallel Primal-Dual Method with Linearization for Structured Convex Optimization
    Zhang, Xiayang
    Tang, Weiye
    Wang, Jiayue
    Zhang, Shiyu
    Zhang, Kangqun
    AXIOMS, 2025, 14 (02)
  • [33] On the Comparison between Primal and Primal-dual Methods in Decentralized Dynamic Optimization
    Xu, Wei
    Yuan, Kun
    Yin, Wotao
    Ling, Qing
    CONFERENCE RECORD OF THE 2019 FIFTY-THIRD ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2019, : 1501 - 1505
  • [34] Primal-Dual Proximal Splitting and Generalized Conjugation in Non-smooth Non-convex Optimization
    Clason, Christian
    Mazurenko, Stanislav
    Valkonen, Tuomo
    APPLIED MATHEMATICS AND OPTIMIZATION, 2021, 84 (02) : 1239 - 1284
  • [35] Regularized Primal-Dual Subgradient Method for Distributed Constrained Optimization
    Yuan, Deming
    Ho, Daniel W. C.
    Xu, Shengyuan
    IEEE TRANSACTIONS ON CYBERNETICS, 2016, 46 (09) : 2109 - 2118
  • [36] A primal-dual algorithm framework for convex saddle-point optimization
    Zhang, Benxin
    Zhu, Zhibin
    JOURNAL OF INEQUALITIES AND APPLICATIONS, 2017
  • [37] Exponential convergence of distributed primal-dual convex optimization algorithm without strong convexity
    Liang, Shu
    Wang, Le Yi
    Yin, George
    AUTOMATICA, 2019, 105 : 298 - 306
  • [38] Primal-dual incremental gradient method for nonsmooth and convex optimization problems
    Jalilzadeh, Afrooz
    OPTIMIZATION LETTERS, 2021, 15 (08) : 2541 - 2554
  • [39] A Second Order Primal-Dual Method for Nonsmooth Convex Composite Optimization
    Dhingra, Neil K.
    Khong, Sei Zhen
    Jovanovic, Mihailo R.
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2022, 67 (08) : 4061 - 4076