Distributed Newton Method for Large-Scale Consensus Optimization

Cited by: 30
Authors
Tutunov, Rasul [1 ]
Bou-Ammar, Haitham [1 ]
Jadbabaie, Ali [2 ]
Affiliations
[1] Huawei Technol Res & Dev, Shenzhen, Peoples R China
[2] MIT, Inst Data Syst & Soc, 77 Massachusetts Ave, Cambridge, MA 02139 USA
Keywords
Convergence; distributed algorithms; subgradient methods; algorithm; descent
DOI
10.1109/TAC.2019.2907711
CLC classification
TP [Automation and computer technology]
Discipline code
0812
Abstract
In this paper, we propose a distributed Newton method for decentralized optimization of large sums of convex functions. Our method creates a set of separable finite-sum minimization problems via a decomposition technique known as Global Consensus, which distributes the computation across the nodes of a graph and enforces a consensus constraint among the separated variables. The key idea is to exploit the sparsity of the dual Hessian and recast the computation of the Newton step as the problem of efficiently solving symmetric diagonally dominant (SDD) linear equations. We validate our algorithm both theoretically and empirically. On the theory side, we show that it exhibits superlinear convergence within a neighborhood of the optimum. Empirically, we demonstrate that it outperforms state-of-the-art algorithms, including ADMM, on a variety of large-scale optimization problems. The proposed approach scales to large problems and has low communication overhead.
Pages: 3983-3994
Page count: 12
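
The computational core described in the abstract, recasting the Newton step as the solution of a symmetric diagonally dominant (SDD) linear system whose sparsity matches the communication graph, can be illustrated with a minimal sketch. The code below is a simplified, assumption-laden illustration rather than the authors' primal-dual scheme: it uses a penalized consensus objective min_x sum_i f_i(x_i) + (rho/2) x^T L x with quadratic local costs, a ring graph, and plain conjugate gradients standing in for a fast SDD solver; the names f_i, a, b, rho, and L, and the penalty formulation itself, are illustrative choices.

```python
# Minimal sketch (not the paper's exact algorithm): Newton steps for a
# penalized consensus problem  min_x  sum_i f_i(x_i) + (rho/2) x^T L x,
# where L is the graph Laplacian. The Hessian diag(f_i'') + rho*L is
# symmetric diagonally dominant (SDD), so the Newton direction can be
# computed with a fast SDD solver; conjugate gradients stands in here.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)
n, rho = 50, 1.0

# Ring-graph Laplacian: each node communicates with its two neighbours.
rows = np.arange(n)
A = sp.coo_matrix((np.ones(n), (rows, (rows + 1) % n)), shape=(n, n))
A = A + A.T
L = sp.diags(np.asarray(A.sum(axis=1)).ravel()) - A

# Local objectives f_i(x) = 0.5 * a_i * (x - b_i)^2 (quadratic for simplicity).
a = rng.uniform(1.0, 2.0, n)
b = rng.normal(size=n)

x = np.zeros(n)
for _ in range(5):
    grad = a * (x - b) + rho * (L @ x)   # assembled from local gradients
    H = sp.diags(a) + rho * L            # SDD Hessian, sparse as the graph
    d, _ = cg(H, -grad, atol=1e-10)      # SDD solve gives the Newton direction
    x = x + d
print("consensus spread:", x.max() - x.min())
```

Because the local costs here are quadratic, a single Newton step already solves the penalized problem; the point of the sketch is only that the linear system defining the step inherits the graph's sparsity and diagonal dominance, which is what makes fast SDD solvers applicable.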