Hierarchical Broadcast Coding: Expediting Distributed Learning at the Wireless Edge

Cited by: 7
Authors
Han, Dong-Jun [1 ]
Sohn, Jy-Yong [1 ]
Moon, Jaekyun [1 ]
Affiliations
[1] Korea Adv Inst Sci & Technol, Sch Elect Engn, Daejeon 34141, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Encoding; Wireless communication; Training; Distributed databases; Computational modeling; Servers; Data models; Distributed learning; gradient descent; wireless edge; stragglers;
DOI
10.1109/TWC.2020.3040792
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification
0808; 0809;
Abstract
Distributed learning plays a key role in reducing the training time of modern deep neural networks with massive datasets. In this article, we consider a distributed learning problem where gradient computation is carried out over a number of computing devices at the wireless edge. We propose hierarchical broadcast coding, a provable coding-theoretic framework to speed up distributed learning at the wireless edge. Our contributions are threefold. First, motivated by the hierarchical nature of real-world edge computing systems, we propose a layered code which mitigates the effects of not only packet losses at the wireless computing nodes but also straggling access points (APs) or small base stations. Second, by strategically allocating data partitions to nodes in the overlapping areas between cells, our technique achieves the fundamental lower bound on the computational load required to combat stragglers. Finally, we take advantage of the broadcast nature of wireless networks, by which wireless devices in overlapping cell coverage broadcast to more than one AP; this further reduces the overall training time in the presence of straggling APs. Experimental results on Amazon EC2 confirm the advantage of the proposed methods in speeding up learning. Our design targets any gradient-descent-based learning algorithm, including linear/logistic regression and deep learning.
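The straggler-mitigation idea behind the abstract can be illustrated with a minimal, self-contained sketch. This shows a classical fractional-repetition gradient-coding scheme, not the paper's hierarchical broadcast construction; the toy least-squares task, the group assignment, and all variable names are illustrative assumptions:

```python
import numpy as np

# Toy least-squares problem: 8 samples, 3 features (illustrative data).
rng = np.random.default_rng(0)
X, y = rng.normal(size=(8, 3)), rng.normal(size=8)
w = np.zeros(3)

def grad(idx):
    # Gradient of the squared loss over one data partition.
    return X[idx].T @ (X[idx] @ w - y[idx])

# 4 partitions, 4 workers, tolerate s = 1 straggler: each partition is
# replicated on s + 1 = 2 workers (fractional repetition).
parts = np.array_split(np.arange(8), 4)
groups = {0: [0, 1], 1: [0, 1], 2: [2, 3], 3: [2, 3]}  # worker -> partitions

# Each worker sends the sum of gradients over its assigned partitions.
coded = {i: sum(grad(parts[p]) for p in groups[i]) for i in range(4)}

# Suppose worker 1 straggles; any 3 responses still cover both groups.
responses = {i: coded[i] for i in (0, 2, 3)}

# Master keeps one representative per replication group.
full, seen = np.zeros(3), set()
for i, g in responses.items():
    key = 0 if i in (0, 1) else 1
    if key not in seen:
        seen.add(key)
        full += g

exact = sum(grad(p) for p in parts)
assert np.allclose(full, exact)  # full gradient recovered without worker 1
```

The paper's contribution layers such codes hierarchically (across both wireless nodes and APs) and exploits broadcast in overlapping cells, whereas this sketch covers only the single-layer straggler-tolerance principle.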
Pages: 2266-2281
Page count: 16