Two-way Delayed Updates with Model Similarity in Communication-Efficient Federated Learning

Times Cited: 0
Authors
Mao, Yingchi [1 ,2 ]
Wang, Zibo [2 ]
Wu, Jun [2 ]
Shen, Lijuan [2 ]
Xu, Shufang [1 ,2 ]
Wu, Jie [3 ]
Affiliations
[1] Hohai Univ, Key Lab Water Big Data Technol, Minist Water Resources, Nanjing, Peoples R China
[2] Hohai Univ, Sch Comp & Informat, Nanjing, Peoples R China
[3] Temple Univ, Ctr Networked Comp, Philadelphia, PA USA
Source
2023 19TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING, MSN 2023 | 2023
Keywords
federated learning; data heterogeneity; communication efficiency optimization; communication frequency;
DOI
10.1109/MSN60784.2023.00080
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The success of the IoT and the widespread use of edge devices have led to explosive growth in data, and the quality and scale of that data determine the performance of machine learning models. Federated learning has attracted widespread attention for its ability to exploit isolated data while protecting data privacy, and models trained federatively can achieve strong generalization. However, the large number of devices and the complex models involved in federated training inflate communication costs and degrade the performance of the global model. Although existing approaches can reduce communication costs, they ignore the loss of global model accuracy in heterogeneous environments. To alleviate the heavy communication costs of federated learning, this paper focuses on reducing both upstream and downstream communication frequency while preserving global model accuracy. We propose a Two-way Delayed Updates method with Model Similarity for Communication-Efficient Federated Learning (FedTDMS). FedTDMS employs personalized local computation to improve global model accuracy on heterogeneous data, and it combines a local update relevance check with global model compensation to reduce communication frequency in federated learning. Experiments on the MNIST-FL and CIFAR-10-FL datasets show that FedTDMS greatly improves communication efficiency while maintaining good global model accuracy.
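The relevance-check and compensation idea described in the abstract can be illustrated with a minimal sketch. The snippet below is not the paper's algorithm: the choice of cosine similarity as the similarity measure, the threshold value, and the names should_upload and aggregate_with_compensation are assumptions introduced purely for illustration. It shows the general pattern of a client skipping an upload when its new local update is sufficiently similar to the update it last sent, while the server compensates for skipped clients by reusing their cached updates during aggregation.

```python
import numpy as np

def cosine_similarity(u, v):
    # Similarity between two flattened update vectors (assumed measure, not the paper's exact criterion).
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def should_upload(current_update, last_uploaded_update, threshold=0.9):
    # Client-side relevance check (illustrative): upload only if the new
    # update differs enough from the copy the server already holds.
    if last_uploaded_update is None:
        return True  # nothing cached on the server yet
    return cosine_similarity(current_update, last_uploaded_update) < threshold

def aggregate_with_compensation(received, cache, client_ids):
    # Server-side aggregation (illustrative): fresh updates replace the
    # cached copy; clients that skipped this round are compensated by
    # reusing their cached update instead of being dropped.
    updates = []
    for cid in client_ids:
        if cid in received:
            cache[cid] = received[cid]
        if cid in cache:
            updates.append(cache[cid])
    return np.mean(updates, axis=0)

# Toy usage: two clients, one of which skips the round because its new
# update is nearly identical to the one it uploaded previously.
if __name__ == "__main__":
    cache = {0: np.array([1.0, 0.0]), 1: np.array([0.0, 1.0])}
    new_update = np.array([0.98, 0.05])
    received = {}
    if should_upload(new_update, cache.get(0)):
        received[0] = new_update
    print(aggregate_with_compensation(received, cache, [0, 1]))
```

The same kind of threshold test can in principle be applied in the downstream direction as well, with the server withholding a global model broadcast when it has changed little; the specific decision rule used by FedTDMS is described in the paper itself.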
Pages: 520 - 527
Number of Pages: 8
Related Papers
50 records in total
  • [31] FedCS: Communication-Efficient Federated Learning with Compressive Sensing
    Liu, Ye
    Chang, Shan
    Liu, Yiqi
    2022 IEEE 28TH INTERNATIONAL CONFERENCE ON PARALLEL AND DISTRIBUTED SYSTEMS, ICPADS, 2022, : 17 - 24
  • [32] Communication-Efficient and Personalized Federated Lottery Ticket Learning
    Seo, Sejin
    Ko, Seung-Woo
    Park, Jihong
    Kim, Seong-Lyun
    Bennis, Mehdi
    SPAWC 2021: 2021 IEEE 22ND INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (IEEE SPAWC 2021), 2021, : 581 - 585
  • [33] LotteryFL: Empower Edge Intelligence with Personalized and Communication-Efficient Federated Learning
    Li, Ang
    Sun, Jingwei
    Wang, Binghui
    Duan, Lin
    Li, Sicheng
    Chen, Yiran
    Li, Hai
    2021 ACM/IEEE 6TH SYMPOSIUM ON EDGE COMPUTING (SEC 2021), 2021, : 68 - 79
  • [34] AFedAvg: communication-efficient federated learning aggregation with adaptive communication frequency and gradient sparse
    Li, Yanbin
    He, Ziming
    Gu, Xingjian
    Xu, Huanliang
    Ren, Shougang
    JOURNAL OF EXPERIMENTAL & THEORETICAL ARTIFICIAL INTELLIGENCE, 2024, 36 (01) : 47 - 69
  • [35] Communication-efficient federated learning via personalized filter pruning
    Min, Qi
    Luo, Fei
    Dong, Wenbo
    Gu, Chunhua
    Ding, Weichao
    INFORMATION SCIENCES, 2024, 678
  • [36] QSFL: Two-Level Communication-Efficient Federated Learning on Mobile Edge Devices
    Yi, Liping
    Wang, Gang
    Wang, Xiaofei
    Liu, Xiaoguang
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2024, 17 (06) : 4166 - 4182
  • [37] Communication-Efficient Federated Learning Algorithm Based on Event Triggering
    Gao H.
    Yang L.
    Zhu J.
    Zhang M.
    Wu Q.
    Dianzi Yu Xinxi Xuebao/Journal of Electronics and Information Technology, 2023, 45 (10) : 3710 - 3718
  • [38] Communication-Efficient Federated Learning for Network Traffic Anomaly Detection
    Cui, Xiao
    Han, Xiaohui
    Liu, Guangqi
    Zuo, Wenbo
    Wang, Zhiwen
    2023 19TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING, MSN 2023, 2023, : 398 - 405
  • [39] LaF: Lattice-Based and Communication-Efficient Federated Learning
    Xu, Peng
    Hu, Manqing
    Chen, Tianyang
    Wang, Wei
    Jin, Hai
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2022, 17 : 2483 - 2496
  • [40] VeriFL: Communication-Efficient and Fast Verifiable Aggregation for Federated Learning
    Guo, Xiaojie
    Liu, Zheli
    Li, Jin
    Gao, Jiqiang
    Hou, Boyu
    Dong, Changyu
    Baker, Thar
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2021, 16 : 1736 - 1751