A two-phase half-async method for heterogeneity-aware federated learning

Cited by: 3
Authors
Ma, Tianyi [1 ,2 ]
Mao, Bingcheng [1 ,2 ]
Chen, Ming [1 ,2 ]
Affiliations
[1] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou, Zhejiang, Peoples R China
[2] Hithink RoyalFlush Informat Network Co Ltd, Hangzhou, Zhejiang, Peoples R China
Keywords
Federated learning; Federated optimization; Non-IID data;
DOI
10.1016/j.neucom.2021.08.146
CLC number
TP18 [Theory of artificial intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL) is a distributed machine learning paradigm that allows models to be trained on decentralized data across large-scale edge/mobile devices without collecting the raw data. However, existing methods remain far from efficient and stable under extreme statistical and environmental heterogeneity. In this work, we propose FedHA (Federated Heterogeneity Awareness), a novel half-async algorithm that combines the merits of asynchronous and synchronous methods. It separates training into two phases by estimating the consistency of the optimization directions of the collected local models, and in these phases it applies different strategies, namely model selection, adaptive local epochs, and heterogeneity-weighted aggregation, to facilitate fast and stable training. We provide theoretical convergence and communication guarantees for both convex and non-convex problems without introducing extra assumptions. In the first phase (the consistent phase), the convergence rate of FedHA is O(1/e^T), which is faster than existing methods while also reducing communication. In the second phase (the inconsistent phase), FedHA retains the best-known results in convergence (O(1/T)) and communication (O(1/c)). We validate the proposed algorithm on different tasks with both IID (independently and identically distributed) and non-IID data, and the results show that, with the proposed strategies, our algorithm is efficient, stable, and flexible under this twofold heterogeneity. (c) 2021 Elsevier B.V. All rights reserved.
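Note: the abstract describes FedHA's two-phase mechanism only at a high level. The sketch below is a minimal, hypothetical illustration of how a half-async server might estimate the consistency of client update directions and switch aggregation strategies between a consistent and an inconsistent phase. The cosine-similarity proxy, the fixed threshold, the staleness/heterogeneity weighting, and all function names are assumptions for illustration and are not the paper's exact procedures.

    # Hypothetical sketch of a two-phase half-async server round.
    # The consistency test, threshold, and weights are illustrative
    # assumptions, not the FedHA rules defined in the paper.
    import numpy as np

    def cosine(u, v):
        # Cosine similarity between two flattened client updates.
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

    def direction_consistency(updates):
        # Mean pairwise cosine similarity, used here as an assumed proxy
        # for the "consistency of optimization directions" in the abstract.
        sims = [cosine(updates[i], updates[j])
                for i in range(len(updates)) for j in range(i + 1, len(updates))]
        return float(np.mean(sims)) if sims else 1.0

    def aggregate(global_w, updates, staleness, hetero_scores, consistent):
        # Consistent phase: down-weight clients with a high heterogeneity
        # score (assumption). Inconsistent phase: down-weight stale updates.
        if consistent:
            weights = np.array([1.0 / (1.0 + h) for h in hetero_scores])
        else:
            weights = np.array([1.0 / (1.0 + s) for s in staleness])
        weights /= weights.sum()
        return global_w + sum(w * u for w, u in zip(weights, updates))

    def server_round(global_w, client_updates, staleness, hetero_scores,
                     threshold=0.5):
        # One server round: estimate consistency, pick a phase, aggregate.
        # `threshold` is a hypothetical switch point; the paper infers the
        # phase from the collected local models rather than a constant.
        consistent = direction_consistency(client_updates) >= threshold
        new_w = aggregate(global_w, client_updates, staleness,
                          hetero_scores, consistent)
        # In the consistent phase one could also enlarge the number of local
        # epochs ("adaptive local epoch" in the abstract); omitted here.
        return new_w, consistent

In this sketch the phase decision is purely local to one round; a faithful implementation would also handle asynchronous arrival of client models and the model-selection step mentioned in the abstract.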
Pages: 134-154
Page count: 21
Related papers
50 in total
  • [1] Heterogeneity-aware fair federated learning
    Li, Xiaoli
    Zhao, Siran
    Chen, Chuan
    Zheng, Zibin
    INFORMATION SCIENCES, 2023, 619 : 968 - 986
  • [2] FLASH: Heterogeneity-Aware Federated Learning at Scale
    Yang, Chengxu
    Xu, Mengwei
    Wang, Qipeng
    Chen, Zhenpeng
    Huang, Kang
    Ma, Yun
    Bian, Kaigui
    Huang, Gang
    Liu, Yunxin
    Jin, Xin
    Liu, Xuanzhe
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (01) : 483 - 500
  • [3] SplitAVG: A Heterogeneity-Aware Federated Deep Learning Method for Medical Imaging
    Zhang, Miao
    Qu, Liangqiong
    Singh, Praveer
    Kalpathy-Cramer, Jayashree
    Rubin, Daniel L.
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2022, 26 (09) : 4635 - 4644
  • [4] HADFL: Heterogeneity-aware Decentralized Federated Learning Framework
    Cao, Jing
    Lian, Zirui
    Liu, Weihong
    Zhu, Zongwei
    Ji, Cheng
    2021 58TH ACM/IEEE DESIGN AUTOMATION CONFERENCE (DAC), 2021, : 1 - 6
  • [5] Data Heterogeneity-Aware Personalized Federated Learning for Diagnosis
    Lin, Huiyan
    Li, Heng
    Jin, Haojin
    Yu, Xiangyang
    Yu, Kuai
    Liang, Chenhao
    Fu, Huazhu
    Liu, Jiang
    OPHTHALMIC MEDICAL IMAGE ANALYSIS, OMIA 2024, 2025, 15188 : 53 - 62
  • [6] Federated Learning With Heterogeneity-Aware Probabilistic Synchronous Parallel on Edge
    Zhao, Jianxin
    Han, Rui
    Yang, Yongkai
    Catterall, Benjamin
    Liu, Chi Harold
    Chen, Lydia Y.
    Mortier, Richard
    Crowcroft, Jon
    Wang, Liang
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2022, 15 (02) : 614 - 626
  • [7] HARMONY: Heterogeneity-Aware Hierarchical Management for Federated Learning System
    Tian, Chunlin
    Li, Li
    Shi, Zhan
    Wang, Jun
    Xu, ChengZhong
    2022 55TH ANNUAL IEEE/ACM INTERNATIONAL SYMPOSIUM ON MICROARCHITECTURE (MICRO), 2022, : 631 - 645
  • [8] Resource and Heterogeneity-aware Clients Eligibility Protocol in Federated Learning
    Asad, Muhammad
    Otoum, Safa
    Shaukat, Saima
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 1140 - 1145
  • [9] Helios: Heterogeneity-Aware Federated Learning with Dynamically Balanced Collaboration
    Xu, Zirui
    Yu, Fuxun
    Xiong, Jinjun
    Chen, Xiang
    2021 58TH ACM/IEEE DESIGN AUTOMATION CONFERENCE (DAC), 2021, : 997 - 1002
  • [10] AutoFL: Enabling Heterogeneity-Aware Energy Efficient Federated Learning
    Kim, Young Geun
    Wu, Carole-Jean
    PROCEEDINGS OF 54TH ANNUAL IEEE/ACM INTERNATIONAL SYMPOSIUM ON MICROARCHITECTURE, MICRO 2021, 2021, : 183 - 198