Two-way Delayed Updates with Model Similarity in Communication-Efficient Federated Learning

Cited: 0
Authors
Mao, Yingchi [1 ,2 ]
Wang, Zibo [2 ]
Wu, Jun [2 ]
Shen, Lijuan [2 ]
Xu, Shufang [1 ,2 ]
Wu, Jie [3 ]
Affiliations
[1] Hohai Univ, Key Lab Water Big Data Technol, Minist Water Resources, Nanjing, Peoples R China
[2] Hohai Univ, Sch Comp & Informat, Nanjing, Peoples R China
[3] Temple Univ, Ctr Networked Comp, Philadelphia, PA USA
Source
2023 19TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING, MSN 2023 | 2023
Keywords
federated learning; data heterogeneity; communication efficiency optimization; communication frequency
DOI
10.1109/MSN60784.2023.00080
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
The success of the IoT and the wide deployment of edge devices have brought explosive growth in data, and the quality and scale of that data determine the performance of machine learning models. Federated learning has attracted widespread attention for its ability to exploit isolated data while protecting data privacy, and federated training allows models to achieve strong generalization. However, the large number of devices and the complex models involved in federated training inflate communication costs and degrade the performance of the global model. Although existing approaches can reduce communication costs, they ignore the degradation of global model accuracy in heterogeneous environments. To alleviate the heavy communication costs of federated learning, this paper focuses on reducing both upstream and downstream communication frequency while preserving global model accuracy. We propose a Two-way Delayed Updates method with Model Similarity in Communication-Efficient Federated Learning (FedTDMS). FedTDMS employs personalized local computation to improve global model accuracy on heterogeneous data, and combines a local update relevance check with global model compensation to reduce communication frequency in federated learning. We conduct experiments on the MNIST-FL and CIFAR-10-FL datasets. The results show that FedTDMS greatly improves communication efficiency while maintaining good global model accuracy.
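The abstract's core idea — skip an upload when a client's new update is too similar to what the server already holds, and let the server compensate with the cached update — can be sketched as follows. This is a minimal illustration under assumed details: the cosine-similarity criterion, the threshold value, and all class/function names (`DelayedUpdateClient`, `server_aggregate`) are illustrative and not taken from the paper.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two flattened update vectors.
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom > 0 else 0.0

class DelayedUpdateClient:
    """Skips an upload when the new local update is highly similar to the
    last update the server already has (hypothetical threshold rule; the
    paper's exact relevance check is not given in the abstract)."""
    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.last_uploaded = None  # server-side cached copy of our update

    def maybe_upload(self, update):
        if (self.last_uploaded is not None and
                cosine_similarity(update, self.last_uploaded) >= self.threshold):
            return None  # delayed update: server reuses its cached copy
        self.last_uploaded = update.copy()
        return update

def server_aggregate(uploads, cache):
    """Average the round's updates, substituting each skipped client's
    cached update — a simple stand-in for global model compensation."""
    merged = [u if u is not None else cache[i] for i, u in enumerate(uploads)]
    return np.mean(merged, axis=0)
```

In this sketch the same threshold rule could be applied downstream as well (the server withholding a broadcast when the global model has barely changed), which is what makes the scheme "two-way".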
Pages: 520-527
Number of pages: 8
Related Papers
50 records
  • [1] ADAPTIVE QUANTIZATION OF MODEL UPDATES FOR COMMUNICATION-EFFICIENT FEDERATED LEARNING
    Jhunjhunwala, Divyansh
    Gadhikar, Advait
    Joshi, Gauri
    Eldar, Yonina C.
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 3110 - 3114
  • [2] Selective Updates and Adaptive Masking for Communication-Efficient Federated Learning
    Herzog, Alexander
    Southam, Robbie
    Belarbi, Othmane
    Anwar, Saif
    Bullo, Marcello
    Carnelli, Pietro
    Khan, Aftab
    IEEE TRANSACTIONS ON GREEN COMMUNICATIONS AND NETWORKING, 2024, 8 (02): : 852 - 864
  • [3] Mix2SFL: Two-Way Mixup for Scalable, Accurate, and Communication-Efficient Split Federated Learning
    Oh, Seungeun
    Nam, Hyelin
    Park, Jihong
    Vepakomma, Praneeth
    Raskar, Ramesh
    Bennis, Mehdi
    Kim, Seong-Lyun
    IEEE TRANSACTIONS ON BIG DATA, 2024, 10 (03) : 238 - 248
  • [4] Communication-efficient federated learning
    Chen, Mingzhe
    Shlezinger, Nir
    Poor, H. Vincent
    Eldar, Yonina C.
    Cui, Shuguang
    PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2021, 118 (17)
  • [5] Communication-Efficient Distributed Optimization for Sparse Learning via Two-Way Truncation
    Ren, Jineng
    Li, Xingguo
    Haupt, Jarvis
    2017 IEEE 7TH INTERNATIONAL WORKSHOP ON COMPUTATIONAL ADVANCES IN MULTI-SENSOR ADAPTIVE PROCESSING (CAMSAP), 2017,
  • [6] FedADP: Communication-Efficient by Model Pruning for Federated Learning
    Liu, Haiyang
    Shi, Yuliang
    Su, Zhiyuan
    Zhang, Kun
    Wang, Xinjun
    Yan, Zhongmin
    Kong, Fanyu
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 3093 - 3098
  • [7] Prototype Similarity Distillation for Communication-Efficient Federated Unsupervised Representation Learning
    Zhang, Chen
    Xie, Yu
    Chen, Tingbin
    Mao, Wenjie
    Yu, Bin
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (11) : 6865 - 6876
  • [8] Communication-Efficient Vertical Federated Learning
    Khan, Afsana
    ten Thij, Marijn
    Wilbik, Anna
    ALGORITHMS, 2022, 15 (08)
  • [9] Communication-Efficient Adaptive Federated Learning
    Wang, Yujia
    Lin, Lu
    Chen, Jinghui
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [10] Communication-efficient clustered federated learning via model distance
    Zhang, Mao
    Zhang, Tie
    Cheng, Yifei
    Bao, Changcun
    Cao, Haoyu
    Jiang, Deqiang
    Xu, Linli
    MACHINE LEARNING, 2024, 113 (06) : 3869 - 3888