Distributed Proportional Stochastic Coordinate Descent With Social Sampling

Cited by: 0
Authors: Ghassemi, Mohsen [1]; Sarwate, Anand D. [1]
Affiliation: [1] Rutgers State Univ, Dept Elect & Comp Engn, Piscataway, NJ 08854 USA
Source: 2015 53RD ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2015
Keywords: OPTIMIZATION
DOI: not available
CLC classification: TP [Automation technology; Computer technology]
Subject classification code: 0812
Abstract:
We consider stochastic message passing algorithms that limit the communication required for decentralized and distributed convex optimization and provide convergence guarantees on the objective value. We first propose a centralized method, proportional stochastic coordinate descent, which modifies the coordinate-sampling distribution of stochastic coordinate descent: the gradient of the function is treated as a probability distribution, so coordinates are sampled with probabilities proportional to the magnitudes of their partial derivatives. This method may be useful in so-called lock-free decentralized optimization schemes. For general distributed optimization, in which agents jointly minimize the sum of local objectives, we treat the iterates as gradients and propose a stochastic coordinate-wise primal averaging algorithm.
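A minimal sketch of the coordinate-sampling idea described in the abstract, assuming a smooth objective whose full gradient can be evaluated; the function name `proportional_scd`, the step size, and the toy least-squares problem are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def proportional_scd(grad, x0, step=0.1, iters=500, rng=None):
    """Illustrative proportional stochastic coordinate descent (assumption:
    coordinates are sampled with probability proportional to the magnitude
    of the corresponding gradient entry, then updated one at a time)."""
    rng = np.random.default_rng() if rng is None else rng
    x = x0.astype(float)
    for _ in range(iters):
        g = grad(x)                      # full gradient at the current iterate
        weights = np.abs(g)
        if weights.sum() == 0.0:         # stationary point: nothing left to sample
            break
        p = weights / weights.sum()      # gradient magnitudes as a probability distribution
        i = rng.choice(len(x), p=p)      # "proportional" coordinate sampling
        x[i] -= step * g[i]              # single-coordinate gradient step
    return x

# Toy usage on the least-squares objective 0.5 * ||A x - b||^2 (hypothetical example).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
x_hat = proportional_scd(lambda x: A.T @ (A @ x - b), np.zeros(2))
```

Sampling coordinates in proportion to their gradient magnitudes biases each update toward the directions that currently reduce the objective the most, which is what distinguishes this rule from uniform stochastic coordinate descent.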
Pages: 17 - 24
Number of pages: 8
Related Papers (50 in total)
  • [41] Distributed Stochastic Gradient Descent Using LDGM Codes
    Horii, Shunsuke
    Yoshida, Takahiro
    Kobayashi, Manabu
    Matsushima, Toshiyasu
    2019 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2019, : 1417 - 1421
  • [42] Communication-Censored Distributed Stochastic Gradient Descent
    Li, Weiyu
    Wu, Zhaoxian
    Chen, Tianyi
    Li, Liping
    Ling, Qing
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (11) : 6831 - 6843
  • [43] Accelerate Distributed Stochastic Descent for Nonconvex Optimization with Momentum
    Cong, Guojing
    Liu, Tianyi
    2020 IEEE/ACM WORKSHOP ON MACHINE LEARNING IN HIGH PERFORMANCE COMPUTING ENVIRONMENTS (MLHPC 2020) AND WORKSHOP ON ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING FOR SCIENTIFIC APPLICATIONS (AI4S 2020), 2020, : 29 - 39
  • [44] PARALLEL STOCHASTIC ASYNCHRONOUS COORDINATE DESCENT: TIGHT BOUNDS ON THE POSSIBLE PARALLELISM
    Cheung, Yun Kuen
    Cole, Richard J.
    Tao, Yixin
    SIAM JOURNAL ON OPTIMIZATION, 2021, 31 (01) : 448 - 460
  • [45] The coordinate descent method with stochastic optimization for linear support vector machines
    Zheng, Tianyou
    Liang, Xun
    Cao, Run
    NEURAL COMPUTING & APPLICATIONS, 2013, 22 (7-8): : 1261 - 1266
  • [47] Block-cyclic stochastic coordinate descent for deep neural networks
    Nakamura, Kensuke
    Soatto, Stefano
    Hong, Byung-Woo
    NEURAL NETWORKS, 2021, 139 : 348 - 357
  • [48] Accelerated Randomized Coordinate Descent Algorithms for Stochastic Optimization and Online Learning
    Bhandari, Akshita
    Singh, Chandramani
    LEARNING AND INTELLIGENT OPTIMIZATION, LION 12, 2019, 11353 : 1 - 15
  • [49] When Cyclic Coordinate Descent Outperforms Randomized Coordinate Descent
    Gurbuzbalaban, Mert
    Ozdaglar, Asuman
    Parrilo, Pablo A.
    Vanli, N. Denizcan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [50] FAST DISTRIBUTED COORDINATE DESCENT FOR NON-STRONGLY CONVEX LOSSES
    Fercoq, Olivier
    Qu, Zheng
    Richtarik, Peter
    Takac, Martin
    2014 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2014,