Fast Server Learning Rate Tuning for Coded Federated Dropout

Cited by: 1
Authors
Verardo, Giacomo [1 ]
Barreira, Daniel [1 ]
Chiesa, Marco [1 ]
Kostic, Dejan [1 ]
Maguire, Gerald Q., Jr. [1 ]
Affiliations
[1] KTH Royal Inst Technol, Stockholm, Sweden
Source
TRUSTWORTHY FEDERATED LEARNING, FL 2022 | 2023 / Vol. 13448
Funding
Swedish Research Council
Keywords
Federated Learning; Hyper-parameters tuning; Coding Theory;
DOI
10.1007/978-3-031-28996-5_7
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In Federated Learning (FL), clients with low computational power train a common machine learning model by exchanging parameter updates instead of transmitting potentially private data. Federated Dropout (FD) improves the communication efficiency of an FL session by selecting a subset of model parameters to be updated in each training round. However, compared to standard FL, FD achieves considerably lower accuracy and converges more slowly. In this chapter, we leverage coding theory to enhance FD by allowing a different sub-model to be used at each client. We also show that by carefully tuning the server learning rate hyper-parameter, we can train faster while matching the final accuracy of the no-dropout case. Evaluations on the EMNIST dataset show that our mechanism achieves 99.6% of the final accuracy of the no-dropout case while requiring 2.43x less bandwidth to reach that level of accuracy.
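The two ingredients of the abstract — distinct constant-weight sub-model masks per client and a tunable server learning rate applied to the aggregated update — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the random choice of codewords (a real constant-weight code would be constructed deterministically), and the per-coordinate averaging rule are all assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def constant_weight_masks(num_clients, num_units, keep):
    """Assign each client a distinct binary mask of identical weight
    (number of kept units), so every client trains a different sub-model.
    Requires C(num_units, keep) >= num_clients distinct codewords."""
    masks, seen = [], set()
    while len(masks) < num_clients:
        idx = tuple(sorted(rng.choice(num_units, size=keep, replace=False)))
        if idx not in seen:  # enforce distinct codewords across clients
            seen.add(idx)
            m = np.zeros(num_units, dtype=bool)
            m[list(idx)] = True
            masks.append(m)
    return masks

def server_update(weights, client_updates, masks, server_lr):
    """Aggregate masked client updates: each coordinate is averaged over
    the clients that actually trained it, then scaled by the server
    learning rate before being applied to the global model."""
    delta = np.zeros_like(weights)
    counts = np.zeros_like(weights)
    for upd, m in zip(client_updates, masks):
        delta[m] += upd[m]
        counts[m] += 1
    np.divide(delta, counts, out=delta, where=counts > 0)
    return weights + server_lr * delta
```

Raising `server_lr` above 1.0 compensates for the fact that each coordinate is trained by only a fraction of the clients per round, which is the intuition behind tuning it to recover convergence speed.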
Pages: 84-99
Page count: 16