FedSVRG Based Communication Efficient Scheme for Federated Learning in MEC Networks

Cited by: 14
Authors
Chen, Dawei [1 ]
Hong, Choong Seon [2 ]
Zha, Yiyong [3 ]
Zhang, Yunfei [3 ]
Liu, Xin [3 ]
Han, Zhu [1 ,4 ]
Affiliations
[1] Univ Houston, Dept Elect & Comp Engn, Houston, TX 77004 USA
[2] Kyung Hee Univ, Dept Comp Sci & Engn, Yongin 17104, Gyeonggi Do, South Korea
[3] Tencent Technol Co Ltd, Shenzhen 518054, Guangdong, Peoples R China
[4] Kyung Hee Univ, Dept Comp Sci & Engn, Seoul 446701, South Korea
Keywords
Servers; Collaborative work; Machine learning; Data models; Stochastic processes; Computational modeling; Training; Federated learning; multi-access edge computing; stochastic variance reduced gradient;
DOI
10.1109/TVT.2021.3089431
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Subject Classification Code
0808 ; 0809 ;
Abstract
Recently, federated learning, a novel machine learning technique, has attracted ever-increasing interest from both academia and industry. The main idea of federated learning is to collaboratively train a globally optimal machine learning model among all participants. During parameter updating, the communication cost of the system or network can become extremely large when the number of iterations and participants is high. Although the edge computing paradigm can decrease latency to a certain extent, obtaining further delay reduction remains a challenge. To address this problem, we first model it as a finite-sum optimization problem. We then propose a federated stochastic variance reduced gradient based method that decreases the number of communication iterations between the participants and the server from the system perspective while preserving accuracy, and we provide the corresponding convergence analysis. Finally, we evaluate the proposed method on linear regression and logistic regression problems. The simulation results show that our method reduces the communication cost significantly compared with general stochastic gradient descent based federated learning.
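To make the idea concrete, the following is a minimal sketch of a federated SVRG-style update loop on logistic regression with synthetic data, assuming the standard SVRG correction (local stochastic gradient minus the stochastic gradient at the reference model plus the aggregated full gradient) and simple model averaging at the server. The step size, number of local steps, client partitioning, and all function names are illustrative assumptions, not the paper's exact algorithm or settings.

```python
# Illustrative FedSVRG-style sketch (assumed structure, not the paper's exact method).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def full_gradient(w, X, y):
    # Gradient of the logistic loss over a client's whole local dataset.
    return X.T @ (sigmoid(X @ w) - y) / len(y)

def sample_gradient(w, X, y, idx):
    # Stochastic gradient on a single local sample.
    xi, yi = X[idx], y[idx]
    return xi * (sigmoid(xi @ w) - yi)

def local_svrg_update(w_ref, mu, X, y, lr=0.1, local_steps=50, rng=None):
    # Variance-reduced local steps: stochastic gradient at the current iterate,
    # minus the stochastic gradient at the reference point w_ref, plus the
    # server-aggregated full gradient mu.
    rng = rng or np.random.default_rng()
    w = w_ref.copy()
    for _ in range(local_steps):
        i = rng.integers(len(y))
        g = sample_gradient(w, X, y, i) - sample_gradient(w_ref, X, y, i) + mu
        w -= lr * g
    return w

# Synthetic data split across a few clients (illustrative only).
rng = np.random.default_rng(0)
n_clients, n_per_client, dim = 5, 200, 10
w_true = rng.normal(size=dim)
clients = []
for _ in range(n_clients):
    X = rng.normal(size=(n_per_client, dim))
    y = (sigmoid(X @ w_true) > 0.5).astype(float)
    clients.append((X, y))

# Server loop: one communication round per outer iteration.
w_global = np.zeros(dim)
for round_ in range(20):
    # Clients report full local gradients at the current global model;
    # the server aggregates them into the reference gradient mu.
    mu = np.mean([full_gradient(w_global, X, y) for X, y in clients], axis=0)
    # Each client runs variance-reduced local steps; the server averages the results.
    local_models = [local_svrg_update(w_global, mu, X, y, rng=rng) for X, y in clients]
    w_global = np.mean(local_models, axis=0)
```

Because the variance-reduction term lets each client take many accurate local steps per round, far fewer communication rounds are needed than with plain SGD-based federated updates, which is the communication saving the abstract describes.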
Pages: 7300-7304
Page count: 5