A partial PPA block-wise ADMM for multi-block linearly constrained separable convex optimization

Cited by: 8
Authors
Shen, Yuan [1 ]
Zhang, Xingying [1 ]
Zhang, Xiayang [2 ]
Affiliations
[1] Nanjing Univ Finance & Econ, Sch Appl Math, Nanjing, Peoples R China
[2] Nanjing Inst Technol, Dept Math & Phys, Nanjing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Convex optimization; augmented Lagrangian; alternating direction method of multipliers; multi-block; proximal point algorithm; AUGMENTED LAGRANGIAN METHOD; PARALLEL SPLITTING METHOD; LOW-RANK; MINIMIZATION; SPARSE; CONVERGENCE; ALGORITHM;
DOI
10.1080/02331934.2020.1728756
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research]
Subject classification codes
070105; 12; 1201; 1202; 120202
Abstract
The alternating direction method of multipliers (ADMM) is a classical and effective method for solving two-block convex optimization problems subject to linear constraints. However, its convergence is not guaranteed in the multi-block case without additional assumptions. One remedy is the block-wise ADMM (BADMM), in which the variables are first regrouped into two groups and the augmented Lagrangian function is then minimized with respect to each block variable: updates proceed in a Gauss-Seidel fashion between the two groups and in a Jacobi fashion within each group. To establish convergence, a special proximal term is added to every subproblem. In this paper, we propose a new partial PPA block-wise ADMM in which proximal terms are added only to the subproblems in the first group. At the end of each iteration, an extension step with a fixed step size is performed on all variables. Since the subproblems in the second group are left unmodified, the resulting iterates may be of better quality and the method may converge faster. Preliminary experimental results show that the new algorithm is empirically effective on both synthetic and real problems when compared with several highly efficient ADMM-based algorithms.
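The scheme sketched in the abstract can be illustrated on a toy instance of the separable model min over (x_1, ..., x_p) of sum_i theta_i(x_i) subject to sum_i A_i x_i = b. The following is a minimal NumPy sketch under stated assumptions, not the authors' algorithm: three quadratic blocks, the first group taken as {x_1} (with an added proximal term), the second group as {x_2, x_3} (Jacobi updates, no proximal terms), a multiplier update at the predicted point, and a final extension step with a fixed step size on all variables. The problem data, the grouping, and the parameter values beta, tau, alpha are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy 3-block problem:  min  sum_i 0.5*||x_i - c_i||^2
#                       s.t. A_1 x_1 + A_2 x_2 + A_3 x_3 = b
m, n = 8, 5
A = [rng.standard_normal((m, n)) for _ in range(3)]
c = [rng.standard_normal(n) for _ in range(3)]
b = rng.standard_normal(m)

beta, tau, alpha = 1.0, 1.0, 0.9   # penalty, proximal weight, extension step size (illustrative)
x = [np.zeros(n) for _ in range(3)]
lam = np.zeros(m)
I = np.eye(n)

for k in range(500):
    # Group 1 = {x_1}: Gauss-Seidel step with an extra proximal term (tau/2)*||x_1 - x_1^k||^2.
    r1 = A[1] @ x[1] + A[2] @ x[2] - b
    x1_new = np.linalg.solve((1.0 + tau) * I + beta * A[0].T @ A[0],
                             c[0] + A[0].T @ lam - beta * A[0].T @ r1 + tau * x[0])

    # Group 2 = {x_2, x_3}: Jacobi steps without proximal terms, each using the new x_1
    # but the previous value of the other block in the group.
    x_pred = [x1_new, None, None]
    for i in (1, 2):
        j = 3 - i                                   # the other block in group 2
        ri = A[0] @ x1_new + A[j] @ x[j] - b
        x_pred[i] = np.linalg.solve(I + beta * A[i].T @ A[i],
                                    c[i] + A[i].T @ lam - beta * A[i].T @ ri)

    # Multiplier update at the predicted point.
    resid = sum(A[i] @ x_pred[i] for i in range(3)) - b
    lam_pred = lam - beta * resid

    # Extension (correction) step with a fixed step size on all primal and dual variables.
    x = [x[i] + alpha * (x_pred[i] - x[i]) for i in range(3)]
    lam = lam + alpha * (lam_pred - lam)

print("constraint violation:", np.linalg.norm(sum(A[i] @ x[i] for i in range(3)) - b))

The penalty, proximal weight, and step size would have to satisfy the conditions stated in the paper for convergence to be guaranteed; the values above only exercise the update structure on a well-conditioned toy problem.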
Pages: 631-657
Number of pages: 27
Related papers
50 records in total (entries [41]-[50] shown)
  • [41] A Homotopy Alternating Direction Method of Multipliers for Linearly Constrained Separable Convex Optimization
    Yang, J.
    Dai, Y.-Q.
    Peng, Z.
    Zhuang, J.-P.
    Zhu, W.-X.
    Journal of the Operations Research Society of China, 2017, 5 (2): 271-290
  • [42] Randomized Neural Networks Based Decentralized Multi-Task Learning via Hybrid Multi-Block ADMM
    Ye, Yu
    Xiao, Ming
    Skoglund, Mikael
    IEEE Transactions on Signal Processing, 2021, 69: 2844-2857
  • [43] A linear algebra perspective on the random multi-block ADMM: the QP case
    Cipolla, Stefano
    Gondzio, Jacek
    Calcolo, 2023, 60 (04)
  • [44] Problem Decomposition in Block-Separable Convex Optimization: Ideas Old and New
    Rockafellar, R. Tyrrell
    Journal of Nonlinear and Convex Analysis, 2018, 19 (09): 1459-1474
  • [46] A Majorized ADMM with Indefinite Proximal Terms for Linearly Constrained Convex Composite Optimization
    Li, Min
    Sun, Defeng
    Toh, Kim-Chuan
    SIAM Journal on Optimization, 2016, 26 (02): 922-950
  • [47] Convergence analysis of generalized ADMM with majorization for linearly constrained composite convex optimization
    Li, Hongwu
    Zhang, Haibin
    Xiao, Yunhai
    Li, Peili
    Optimization Letters, 2024, 18: 1173-1200
  • [48] Linearized symmetric multi-block ADMM with indefinite proximal regularization and optimal proximal parameter
    Chang, Xiaokai
    Bai, Jianchao
    Song, Dunjiang
    Liu, Sanyang
    Calcolo, 2020, 57 (04)
  • [49] Distributed Multi-block Partially Symmetric Bregman ADMM for Nonconvex and Nonsmooth Sharing Problem
    Cui, Tian-Tian
    Dang, Ya-Zheng
    Gao, Yan
    Journal of the Operations Research Society of China, 2025
  • [50] On the convergence of the direct extension of ADMM for three-block separable convex minimization models with one strongly convex function
    Cai, Xingju
    Han, Deren
    Yuan, Xiaoming
    Computational Optimization and Applications, 2017, 66 (01): 39-73