RPN: A Residual Pooling Network for Efficient Federated Learning

Cited: 6
Authors
Huang, Anbu [1]
Chen, Yuanyuan [2]
Liu, Yang [1]
Chen, Tianjian [1]
Yang, Qiang [3]
Affiliations
[1] Webank AI Lab, Shenzhen, Guangdong, Peoples R China
[2] Nanyang Technol Univ, Singapore, Singapore
[3] Hong Kong Univ Sci & Technol, Hong Kong, Peoples R China
Source
ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE | 2020 / Vol. 325
DOI
10.3233/FAIA200222
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Federated learning is a distributed machine learning framework that enables different parties to collaboratively train a model while protecting data privacy and security. Due to model complexity, network unreliability and connection instability, communication cost has become a major bottleneck in applying federated learning to real-world applications. Existing strategies either require manual hyperparameter tuning or break the original training process into multiple steps, which makes end-to-end implementation difficult. In this paper, we propose a novel compression strategy called Residual Pooling Network (RPN). Our experiments show that RPN not only reduces data transmission effectively, but also achieves almost the same performance as standard federated learning. Our approach runs as an end-to-end procedure and can be readily applied to all CNN-based model training scenarios to improve communication efficiency, making it easy to deploy in real-world applications without much human intervention.
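The abstract describes RPN only at a high level, so the following is a minimal toy sketch, not the authors' algorithm: it merely illustrates the general idea the name suggests, namely compressing the client-to-server weight residual by pooling and approximately restoring it on the server. The pooling scheme, window size k, and all function names here are assumptions made for illustration.

    import numpy as np

    def pool_residual(local_w, global_w, k=2):
        # Hypothetical client-side compressor (assumed, not from the paper):
        # average-pool non-overlapping windows of length k along axis 0 of the
        # update residual (local weights minus global weights).
        residual = local_w - global_w
        n = residual.shape[0] - residual.shape[0] % k  # drop ragged tail for simplicity
        return residual[:n].reshape(n // k, k, *residual.shape[1:]).mean(axis=1)

    def restore_residual(pooled, k=2):
        # Hypothetical server-side decompressor: nearest-neighbour upsampling,
        # repeating each pooled value k times to approximate the residual.
        return np.repeat(pooled, k, axis=0)

    # Toy round-trip: the client uploads a tensor k times smaller, and the
    # server applies the lossy residual to the global model.
    global_w = np.zeros((8, 3))
    local_w = global_w + 0.01 * np.random.randn(8, 3)  # stand-in for local training
    uplink = pool_residual(local_w, global_w)          # shape (4, 3) instead of (8, 3)
    global_w += restore_residual(uplink)               # approximate model update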
Pages: 1223-1229 (7 pages)