Communication-Efficient Federated Learning via Predictive Coding

Cited by: 10
Authors
Yue, Kai [1 ]
Jin, Richeng [1 ]
Wong, Chau-Wai [1 ]
Dai, Huaiyu [1 ]
Affiliations
[1] NC State Univ, Dept Elect & Comp Engn, Raleigh, NC 27695 USA
Funding
U.S. National Science Foundation
Keywords
Predictive models; Servers; Collaborative work; Predictive coding; Entropy coding; Costs; Quantization (signal); Federated learning; distributed optimization
DOI
10.1109/JSTSP.2022.3142678
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
Federated learning enables remote workers to collaboratively train a shared machine learning model while keeping training data local. In the use case of wireless mobile devices, communication overhead is a critical bottleneck due to limited power and bandwidth. Prior work has used data compression tools such as quantization and sparsification to reduce this overhead. In this paper, we propose a predictive coding based compression scheme for federated learning. The scheme shares prediction functions among all devices and lets each worker transmit a compressed residual vector derived from the reference. In each communication round, we select the predictor and quantizer based on the rate-distortion cost, and further reduce redundancy with entropy coding. Extensive simulations show that the communication cost can be reduced by up to 99% while achieving learning performance on par with or better than baseline methods.
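The round structure sketched in the abstract (predict a reference, quantize the residual, pick the quantizer by rate-distortion cost) can be illustrated as follows. This is a minimal sketch under assumed choices: an order-1 predictor that reuses the previous update, uniform scalar quantization, and a crude bit-length rate proxy. All function names (`predict`, `quantize`, `rd_cost`) are illustrative, not the paper's actual API, and the entropy-coding stage is only noted in a comment.

```python
# Hypothetical sketch of one communication round of residual-based
# predictive coding, as described in the abstract. Not the paper's code.

DIM = 4  # model-update dimension (toy size)

def predict(history):
    """Order-1 predictor: reuse the most recent update as the reference;
    fall back to a zero vector when no history exists."""
    return history[-1] if history else [0.0] * DIM

def quantize(residual, step):
    """Uniform scalar quantization of the residual vector."""
    return [round(r / step) for r in residual]

def dequantize(indices, step):
    return [i * step for i in indices]

def rd_cost(residual, indices, step, lam=0.1):
    """Rate-distortion cost: squared distortion plus lam times a
    rough rate proxy (bit lengths of the quantization indices)."""
    recon = dequantize(indices, step)
    distortion = sum((r, q) and (r - q) ** 2 for r, q in zip(residual, recon))
    rate = sum(abs(i).bit_length() + 1 for i in indices)
    return distortion + lam * rate

history = []                      # no previous updates yet
update = [0.5, -0.2, 0.1, 0.0]   # current local model update

ref = predict(history)
residual = [u - p for u, p in zip(update, ref)]

# Select the quantizer step size minimizing the rate-distortion cost,
# mirroring the per-round predictor/quantizer selection in the scheme.
best_step = min((0.05, 0.1, 0.2),
                key=lambda s: rd_cost(residual, quantize(residual, s), s))
indices = quantize(residual, best_step)

# The worker would transmit the selected predictor/quantizer ids and the
# entropy-coded indices; the server reconstructs from the shared reference.
recon = [p + q for p, q in zip(ref, dequantize(indices, best_step))]
```

With these toy numbers the coarsest step wins because the rate penalty dominates the small distortion; in the actual scheme the trade-off is controlled by the rate-distortion weighting and the entropy coder's real bit cost.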
Pages: 369-380
Page count: 12