Coded Federated Learning for Communication-Efficient Edge Computing: A Survey

Times Cited: 4
Authors
Zhang, Yiqian [1 ,2 ,3 ]
Gao, Tianli [1 ,2 ,3 ]
Li, Congduan [1 ,2 ,3 ]
Tan, Chee Wei [4 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Elect & Commun Engn, Shenzhen 518107, Peoples R China
[2] Guizhou Univ, State Key Lab Publ Big Data, Guiyang 550025, Peoples R China
[3] Sun Yat Sen Univ, Shenzhen Key Lab Nav & Commun Integrat, Shenzhen 518107, Peoples R China
[4] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639815, Singapore
Source
IEEE OPEN JOURNAL OF THE COMMUNICATIONS SOCIETY | 2024 / Vol. 5
Funding
U.S. National Science Foundation;
Keywords
Encoding; Servers; Computational modeling; Training; Federated learning; Data models; Surveys; Coding; distributed machine learning; federated learning; distributed computing; edge computing; STRAGGLER MITIGATION; AGGREGATION; COMPUTATION;
DOI
10.1109/OJCOMS.2024.3423362
Chinese Library Classification (CLC)
TM [Electrical technology]; TN [Electronic technology, communication technology];
Discipline codes
0808 ; 0809 ;
Abstract
In the era of artificial intelligence and big data, the demand for data processing has surged, producing ever-larger datasets and ever-greater computation requirements. Distributed machine learning (DML) addresses this challenge by distributing tasks among multiple workers, reducing the resources required of each worker. However, in distributed systems, slow machines, commonly known as stragglers, or failed links can prolong runtimes and degrade performance. This survey explores the application of coding techniques in DML and coded edge computing to enhance the speed, robustness, privacy, and other properties of distributed systems. Notably, it delves into coding in Federated Learning (FL), a specialized distributed learning system. Coding introduces redundancy into the system and exploits multicast opportunities, which creates a tradeoff between computation and communication costs. The survey establishes that coding is a promising approach for building robust and secure distributed systems with low latency.
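The straggler-mitigation idea the abstract describes can be illustrated with a minimal gradient-coding sketch in the style of Tandon et al.'s scheme: each of n = 3 workers stores two of three data partitions (the redundancy) and uploads a single coded vector, and the server can recover the full gradient from any two workers, i.e., tolerate s = 1 straggler. This is an illustrative sketch only; the partition gradients are simulated at random, and the encoding/decoding coefficients are the standard small worked example rather than anything specific to this survey.

```python
import numpy as np

# Gradient coding for n = 3 workers tolerating s = 1 straggler.
# Partition gradients g1, g2, g3 are simulated; the server's target is
# the full gradient g1 + g2 + g3.
rng = np.random.default_rng(0)
g1, g2, g3 = (rng.standard_normal(4) for _ in range(3))
full_gradient = g1 + g2 + g3

# Encoding: each worker uploads ONE fixed linear combination of the
# partial gradients for the partitions it stores (its 2x redundancy).
uploads = {
    1: 0.5 * g1 + g2,   # worker 1 holds partitions {1, 2}
    2: g2 - g3,         # worker 2 holds partitions {2, 3}
    3: 0.5 * g1 + g3,   # worker 3 holds partitions {1, 3}
}

# Decoding: for every 2-worker subset of survivors, a fixed linear
# combination of their uploads equals the full gradient, so one
# straggler can be ignored entirely.
decoders = {
    frozenset({1, 2}): {1: 2.0, 2: -1.0},
    frozenset({1, 3}): {1: 1.0, 3: 1.0},
    frozenset({2, 3}): {2: 1.0, 3: 2.0},
}

for survivors, coeffs in decoders.items():
    recovered = sum(coeffs[w] * uploads[w] for w in survivors)
    assert np.allclose(recovered, full_gradient)
print("full gradient recovered from every 2-worker subset")
```

The tradeoff is visible directly: each worker computes two partial gradients instead of one (extra computation) but uploads only a single coded vector, and the server never waits for the slowest worker (reduced effective communication latency).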
Pages: 4098-4124
Page count: 27