DSCOVR: Randomized primal-dual block coordinate algorithms for asynchronous distributed optimization

Cited by: 0
Authors
Xiao, Lin [1]
Yu, Adams Wei [2]
Lin, Qihang [3]
Chen, Weizhu [1]
Affiliations
[1] Microsoft Research AI, Redmond, WA 98052, United States
[2] Machine Learning Department, Carnegie Mellon University, Pittsburgh, PA 15213, United States
[3] Tippie College of Business, University of Iowa, Iowa City, IA 52245, United States
Source
Journal of Machine Learning Research | 2019 / Volume 20
Keywords
Stochastic systems;
DOI: not available
Abstract
Machine learning with big data often involves large optimization models. For distributed optimization over a cluster of machines, frequent communication and synchronization of all model parameters (optimization variables) can be very costly. A promising solution is to use parameter servers to store different subsets of the model parameters, and update them asynchronously at different machines using local datasets. In this paper, we focus on distributed optimization of large linear models with convex loss functions, and propose a family of randomized primal-dual block coordinate algorithms that are especially suitable for asynchronous distributed implementation with parameter servers. In particular, we work with the saddle-point formulation of such problems, which allows simultaneous data and model partitioning, and exploit its structure by doubly stochastic coordinate optimization with variance reduction (DSCOVR). Compared with other first-order distributed algorithms, we show that DSCOVR may require less overall computation and communication, and less or no synchronization. We discuss the implementation details of the DSCOVR algorithms, and present numerical experiments on an industrial distributed computing system. © 2019 Lin Xiao, Adams Wei Yu, Qihang Lin and Weizhu Chen.
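The doubly stochastic scheme the abstract describes can be illustrated with a minimal single-machine sketch: partition the data matrix into row (data) blocks and column (model) blocks, pick a random block pair each step, and apply variance-reduced primal-dual coordinate updates on the saddle-point formulation. This is not the authors' DSCOVR implementation; the ridge-regression saddle point, block counts, step sizes, and the SVRG-style snapshot schedule are all illustrative assumptions.

```python
import numpy as np

# Sketch of doubly stochastic primal-dual block coordinate updates with
# variance reduction, on the saddle-point form of ridge regression:
#   min_w max_a (1/m) a^T X w - (1/m) sum_i (a_i^2/2 + a_i y_i) + (lam/2)||w||^2
# All parameters below are illustrative, not taken from the paper.
rng = np.random.default_rng(0)
m, d, lam = 20, 10, 0.1
X = rng.standard_normal((m, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(m)

B, K = 4, 2                                  # data (row) / model (column) blocks
row_blocks = np.array_split(np.arange(m), B)
col_blocks = np.array_split(np.arange(d), K)

w, a = np.zeros(d), np.zeros(m)              # primal and dual variables
sigma, tau = 0.3, 0.1                        # dual / primal step sizes (untuned)

for epoch in range(300):
    # Snapshot: the only full matrix-vector products in each epoch.
    w_s, a_s = w.copy(), a.copy()
    u_s, v_s = X @ w_s, X.T @ a_s
    for _ in range(B * K):
        i, j = rng.integers(B), rng.integers(K)   # random block pair (i, j)
        ri, cj = row_blocks[i], col_blocks[j]
        Xij = X[np.ix_(ri, cj)]                   # only this block is touched
        # Unbiased variance-reduced estimates of (X w)_ri and (X^T a)_cj.
        u_hat = u_s[ri] + K * Xij @ (w[cj] - w_s[cj])
        v_hat = v_s[cj] + B * Xij.T @ (a[ri] - a_s[ri])
        a[ri] += sigma * (u_hat - a[ri] - y[ri])  # dual block ascent
        w[cj] -= tau * (v_hat / m + lam * w[cj])  # primal block descent

def ridge_loss(w):
    return 0.5 * np.mean((X @ w - y) ** 2) + 0.5 * lam * w @ w

print(ridge_loss(np.zeros(d)), "->", ridge_loss(w))
```

Each inner step reads and writes only the sub-block X_{ij}, one dual block, and one primal block, which is what makes the pattern amenable to the asynchronous parameter-server setting in the abstract: a worker holding data block i can update any model block j on a server without synchronizing the full parameter vector.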
Related papers
50 in total
  • [41] A Primal-Dual SGD Algorithm for Distributed Nonconvex Optimization
    Yi, Xinlei
    Zhang, Shengjun
    Yang, Tao
    Chai, Tianyou
    Johansson, Karl Henrik
    IEEE/CAA Journal of Automatica Sinica, 2022, 9 (05) : 812 - 833
  • [42] Randomized Primal-Dual Proximal Block Coordinate Updates
    Gao, Xiang
    Xu, Yang-Yang
    Zhang, Shu-Zhong
    Journal of the Operations Research Society of China, 2019, 7 : 205 - 250
  • [43] Fast primal-dual distributed algorithms for scheduling and matching problems
    Panconesi, Alessandro
    Sozio, Mauro
    Distributed Computing, 2010, 22 : 269 - 283
  • [45] Asynchronous Distributed Optimization via Dual Decomposition and Block Coordinate Subgradient Methods
    Lin, Yankai
    Shames, Iman
    Nesic, Dragan
    IEEE TRANSACTIONS ON CONTROL OF NETWORK SYSTEMS, 2021, 8 (03) : 1348 - 1359
  • [46] Distributed Primal-Dual Optimization for Non-uniformly Distributed Data
    Cheng, Minhao
    Hsieh, Cho-Jui
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 2028 - 2034
  • [47] Proximal nested primal-dual gradient algorithms for distributed constraint-coupled composite optimization
    Li, Jingwang
    An, Qing
    Su, Housheng
    APPLIED MATHEMATICS AND COMPUTATION, 2023, 444
  • [48] Solution to DC Optimal Power Flow Problems with Demand Response via Distributed Asynchronous Primal-Dual Algorithms
    Masuda, E.
    Matsuda, Y.
    Wakasa, Y.
    Adachi, R.
    SICE Journal of Control, Measurement, and System Integration, 2020, 13 (03) : 66 - 72
  • [49] Quantized Primal-Dual Algorithms for Network Optimization With Linear Convergence
    Chen, Ziqin
    Liang, Shu
    Li, Li
    Cheng, Shuming
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2024, 69 (01) : 471 - 478
  • [50] Primal-Dual Algorithms for Convex Optimization via Regret Minimization
    Ho-Nguyen, Nam
    Kilinc-Karzan, Fatma
    IEEE CONTROL SYSTEMS LETTERS, 2018, 2 (02): : 284 - 289