ns3-fl: Simulating Federated Learning with ns-3

Cited by: 6
Authors
Ekaireb, Emily [1 ]
Yu, Xiaofan [1 ]
Ergun, Kazim [1 ]
Zhao, Quanling [1 ]
Lee, Kai [1 ]
Huzaifa, Muhammad [1 ]
Affiliation
[1] University of California San Diego, La Jolla, CA 92093, USA
Source
Proceedings of the 2022 Workshop on ns-3 (WNS3 2022), 2022
Funding
U.S. National Science Foundation
Keywords
federated learning; ns-3; network simulation;
DOI
10.1145/3532577.3532591
CLC Number
TP [Automation technology; computer technology]
Discipline Code
0812
Abstract
In recent years, interest in federated learning (FL) has surged, and an increasing number of FL algorithms have been developed. Large-scale deployments to validate these algorithms are often not feasible, creating a need for simulation tools that closely emulate real deployment conditions. Existing federated learning simulators lack complex network settings and instead focus on data and algorithmic development. ns-3 is a discrete-event network simulator that provides a rich set of models for network components and can simulate complex networking scenarios. In this paper, we present ns3-fl, a tool that connects an existing FL simulator, flsim, with ns-3 to produce a federated learning simulator that jointly considers data, algorithm, and network. We first discuss the learning, network, and power models used to develop our tool. We then present an overview of our implementation, including the Client/Server ns-3 applications and interprocess communication protocols. A real Raspberry Pi-based deployment is set up to validate our tool. Finally, we perform a simulation emulating FL training on 40 clients across the UCSD campus and analyze the performance of our tool in terms of wall-clock execution time across FL rounds.
Pages: 97-104
Page count: 8
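The abstract describes Client/Server ns-3 applications that exchange model updates over simulated links. As a rough illustration of how such model-exchange traffic can be modeled in ns-3, the sketch below sends a fixed-size "model upload" from a client node to a server node over a point-to-point link and reports the bytes received. This is a generic example built only from standard ns-3 helpers, not code from ns3-fl; the link rate, delay, port, and the modelBytes parameter are assumed values chosen for illustration.

```cpp
// Hypothetical sketch (not the ns3-fl source): one FL client uploading a
// model of modelBytes bytes to an FL server over a simulated P2P link.
#include "ns3/core-module.h"
#include "ns3/network-module.h"
#include "ns3/internet-module.h"
#include "ns3/point-to-point-module.h"
#include "ns3/applications-module.h"
#include <iostream>

using namespace ns3;

int main (int argc, char *argv[])
{
  uint32_t modelBytes = 1600000;   // assumed model size per round (bytes)
  CommandLine cmd;
  cmd.AddValue ("modelBytes", "Bytes a client uploads per FL round", modelBytes);
  cmd.Parse (argc, argv);

  // Two nodes: node 0 acts as the FL client, node 1 as the FL server.
  NodeContainer nodes;
  nodes.Create (2);

  // Assumed link characteristics for the client-server connection.
  PointToPointHelper p2p;
  p2p.SetDeviceAttribute ("DataRate", StringValue ("10Mbps"));
  p2p.SetChannelAttribute ("Delay", StringValue ("5ms"));
  NetDeviceContainer devices = p2p.Install (nodes);

  InternetStackHelper stack;
  stack.Install (nodes);

  Ipv4AddressHelper address;
  address.SetBase ("10.1.1.0", "255.255.255.0");
  Ipv4InterfaceContainer ifaces = address.Assign (devices);

  // Server side: a PacketSink absorbs the uploaded model bytes over TCP.
  uint16_t port = 9000;
  PacketSinkHelper sink ("ns3::TcpSocketFactory",
                         InetSocketAddress (Ipv4Address::GetAny (), port));
  ApplicationContainer serverApp = sink.Install (nodes.Get (1));
  serverApp.Start (Seconds (0.0));

  // Client side: a BulkSend application pushes exactly modelBytes bytes.
  BulkSendHelper client ("ns3::TcpSocketFactory",
                         InetSocketAddress (ifaces.GetAddress (1), port));
  client.SetAttribute ("MaxBytes", UintegerValue (modelBytes));
  ApplicationContainer clientApp = client.Install (nodes.Get (0));
  clientApp.Start (Seconds (1.0));

  Simulator::Stop (Seconds (60.0));
  Simulator::Run ();

  // Report how much of the "model" reached the server in simulated time.
  Ptr<PacketSink> rx = DynamicCast<PacketSink> (serverApp.Get (0));
  std::cout << "Server received " << rx->GetTotalRx () << " bytes" << std::endl;

  Simulator::Destroy ();
  return 0;
}
```

Placed in an ns-3 tree's scratch/ directory, this builds and runs with the usual ns-3 build tooling. ns3-fl's actual Client/Server applications go further: per the abstract, they also coordinate with the flsim process through interprocess communication and are paired with learning and power models.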