On the Fundamental Limits of Coded Data Shuffling for Distributed Machine Learning

Cited by: 12
Authors
Elmahdy, Adel [1 ]
Mohajer, Soheil [1 ]
Affiliations
[1] Univ Minnesota, Dept Elect & Comp Engn, Minneapolis, MN 55455 USA
Funding
National Science Foundation (USA)
Keywords
Encoding; Distributed databases; Indexes; Machine learning; Task analysis; Training; Data shuffling; coded caching; distributed computing; distributed machine learning; ASSIGNMENT;
DOI
10.1109/TIT.2020.2964547
Chinese Library Classification
TP [Automation and Computer Technology]
Discipline Code
0812
Abstract
We consider the data shuffling problem in a distributed learning system, in which a master node is connected to a set of worker nodes via a shared link in order to communicate a set of files to them. The master node has access to a database of files. In every shuffling iteration, each worker node processes a new subset of files and uses its excess storage to partially cache the remaining files, where the cached files are stored uncoded. The caches of the worker nodes are updated every iteration and must be designed to serve any possible unknown permutation of the files in subsequent iterations. For this problem, we characterize the exact load-memory trade-off for worst-case shuffling by deriving the minimum communication load for a given storage capacity per worker node. As a byproduct, the exact load-memory trade-off for any shuffling is characterized when the number of files equals the number of worker nodes. We propose a novel deterministic coded shuffling scheme that improves the state of the art by exploiting the cache memories to create coded functions that can be decoded by several worker nodes. We then prove the optimality of the proposed scheme by deriving a matching lower bound and showing that the placement phase of the proposed coded shuffling scheme is optimal over all shuffles.
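The core idea behind coded shuffling — exploiting cached content so that a single coded broadcast serves several worker nodes at once — can be illustrated with a minimal sketch. This is a generic two-worker XOR example in the spirit of index coding, not the paper's actual scheme; the file names and sizes are illustrative assumptions.

```python
# Toy coded-broadcast sketch (NOT the paper's scheme): two workers swap files
# after a shuffle. Each worker's cache holds the file it processed last round,
# so one XOR broadcast replaces two uncoded transmissions.

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Files in the master node's database (equal length, purely illustrative).
file_A = b"data-batch-AAAA"
file_B = b"data-batch-BBBB"

# Previous iteration: worker 1 processed A, worker 2 processed B,
# so those files sit uncoded in their respective caches.
cache_w1, cache_w2 = file_A, file_B

# New permutation: worker 1 must now process B, and worker 2 must process A.
# Uncoded delivery would cost two file transmissions over the shared link;
# the coded broadcast A XOR B costs only one.
broadcast = xor_bytes(file_A, file_B)

# Each worker decodes its new file from the broadcast and its own cache.
decoded_w1 = xor_bytes(broadcast, cache_w1)  # cancels A, recovers B
decoded_w2 = xor_bytes(broadcast, cache_w2)  # cancels B, recovers A

assert decoded_w1 == file_B and decoded_w2 == file_A
```

The communication load drops from two file transmissions to one; the paper's scheme generalizes this saving to arbitrary numbers of workers, partial caches, and worst-case permutations.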
Pages: 3098-3131
Page count: 34