Communication-Efficient Distributed Learning of Discrete Probability Distributions

Cited by: 0
Authors
Diakonikolas, Ilias [1 ]
Grigorescu, Elena [2 ]
Li, Jerry [3 ]
Natarajan, Abhiram [2 ]
Onak, Krzysztof [4 ]
Schmidt, Ludwig [3 ]
Affiliations
[1] USC, CS, Los Angeles, CA 90007 USA
[2] Purdue Univ, CS, W Lafayette, IN 47907 USA
[3] MIT, EECS & CSAIL, Cambridge, MA 02139 USA
[4] IBM Res Corp, Albany, NY USA
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017) | 2017 / Vol. 30
Keywords
DENSITY-ESTIMATION; MULTIVARIATE HISTOGRAMS; ALGORITHMS
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We initiate a systematic investigation of distribution learning (density estimation) when the data is distributed across multiple servers. The servers must communicate with a referee, and the goal is to estimate the underlying distribution with as few bits of communication as possible. We focus on non-parametric density estimation of discrete distributions with respect to the ℓ1 and ℓ2 norms. We provide the first non-trivial upper and lower bounds on the communication complexity of this basic estimation task in various settings of interest. Specifically, our results include the following:
1. When the unknown discrete distribution is unstructured and each server has only one sample, we show that any blackboard protocol (i.e., any protocol in which servers interact arbitrarily using public messages) that learns the distribution must essentially communicate the entire sample.
2. For the case of structured distributions, such as k-histograms and monotone distributions, we design distributed learning algorithms that achieve significantly better communication guarantees than the naive ones, and obtain tight upper and lower bounds in several regimes.
Our distributed learning algorithms run in near-linear time and are robust to model misspecification. Our results provide insights on the interplay between structure and communication efficiency for a range of fundamental distribution estimation tasks.
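The naive baseline that the abstract contrasts against can be made concrete with a small simulation. The following is a minimal sketch, not the paper's algorithm: it assumes a Python/NumPy setting (not part of the source) in which each of m servers holds one sample from an unknown distribution over a domain of size n and forwards it to the referee at a cost of about ⌈log2 n⌉ bits, after which the referee outputs the empirical distribution. Function names and parameters below are illustrative assumptions.

```python
import math
import numpy as np

# Illustrative sketch (not the paper's method): the naive one-shot protocol
# for an unstructured discrete distribution over a domain of size n.
# Every server simply forwards its single sample to the referee.

def naive_protocol(samples, n):
    """Referee-side estimate: empirical distribution from all forwarded samples."""
    counts = np.bincount(samples, minlength=n)
    return counts / counts.sum()

def communication_cost_bits(m, n):
    """Total bits when each of the m servers forwards one sample verbatim."""
    return m * math.ceil(math.log2(n))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m = 1000, 50_000                    # domain size, number of servers (one sample each)
    p = rng.dirichlet(np.ones(n))          # unknown distribution (known here only for simulation)
    samples = rng.choice(n, size=m, p=p)   # one sample per server
    p_hat = naive_protocol(samples, n)
    l1_error = np.abs(p_hat - p).sum()
    print(f"l1 error: {l1_error:.3f}, communication: {communication_cost_bits(m, n)} bits")
```

For unstructured distributions, the paper's lower bound says this forwarding cost is essentially unavoidable for blackboard protocols; the improvements described in the abstract apply to structured families such as k-histograms and monotone distributions, whose algorithms are not reproduced here.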
Pages: 11