Communication Topologies for Decentralized Federated Learning

Cited by: 0
Authors
Doetzer, Michael [1]
Mao, Yixin [1]
Diepold, Klaus [1]
Affiliations
[1] Technical University of Munich, Chair of Data Processing, Munich, Germany
Source
2023 Eighth International Conference on Fog and Mobile Edge Computing (FMEC)
Keywords
Federated learning; clustering applications; network topology
DOI
10.1109/FMEC59375.2023.10306161
Chinese Library Classification (CLC)
TP301 [Theory and Methods]
Discipline code
081202
Abstract
Conventional federated learning aims to enable clients to contribute to a global training process while keeping their data local. However, as the number of devices in the network grows, one can no longer assume a central entity with enough bandwidth or computing resources to handle the volume of requests. In this paper, we therefore consider implementing federated learning with different topologies in a network without a central entity. We compare hierarchical and decentralized topologies with varying degrees of interconnectivity. In our experiments, we use 50 clients with small CNNs on the MNIST, FashionMNIST, and CIFAR-10 datasets. Our results show that models in a decentralized network can achieve performance similar to models in a centralized network if the topology is carefully chosen. We relate model accuracy to the estimated communication overhead by counting the communication connections a given topology requires. These results indicate that cluster topologies can exploit similarities between data distributions and reduce communication effort without sacrificing performance. In addition, we present a simple method to estimate the information transfer performance of a topology without empirical testing.
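To make the link-count comparison concrete, the following minimal Python sketch (not the authors' code; the cluster scheme with fully connected cluster heads is an illustrative assumption) tallies the communication connections per round for n = 50 clients under a few candidate topologies:

# A minimal sketch (not the authors' implementation) of counting the
# per-round communication links for several topologies over n clients.
# The cluster scheme (k equal clusters, fully connected internally,
# cluster heads fully connected with each other) is an assumption made
# for illustration, not a detail taken from the paper.

def star_links(n):
    # Centralized baseline: every client talks to a single hub.
    return n - 1

def ring_links(n):
    # Each client exchanges models with its two ring neighbours.
    return n

def fully_connected_links(n):
    # Every pair of clients exchanges models directly.
    return n * (n - 1) // 2

def clustered_links(n, k):
    # k equal clusters, fully connected inside each cluster,
    # plus a fully connected overlay between the k cluster heads.
    size = n // k
    intra = k * size * (size - 1) // 2
    inter = k * (k - 1) // 2
    return intra + inter

n = 50  # number of clients used in the paper's experiments
print("star           :", star_links(n))             # 49
print("ring           :", ring_links(n))             # 50
print("fully connected:", fully_connected_links(n))  # 1225
print("5 clusters     :", clustered_links(n, 5))     # 235

Even this crude count separates the regimes: the fully connected network needs 1225 links per round, the clustered variant 235, and the ring only 50, which is why a carefully chosen sparse topology can keep communication overhead manageable.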
Pages: 232-238
Number of pages: 7