Heterogeneity Challenges of Federated Learning for Future Wireless Communication Networks

Cited by: 0
Authors
Barona Lopez, Lorena Isabel [1]
Saltos, Thomas Borja [1,2]
Affiliations
[1] Escuela Politec Nacl, Dept Informat & Ciencias Comp, Artificial Intelligence & Comp Vis Res Lab, Quito 170525, Ecuador
[2] Univ Estatal Bolivar, Fac Agr Nat Resources & Environm, Guaranda 020150, Ecuador
Keywords
behavioral heterogeneity; federated learning; mobile communications; statistical heterogeneity; system heterogeneity; wireless communications;
DOI
10.3390/jsan14020037
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Two technologies of great interest in recent years, Artificial Intelligence (AI) and massive wireless communication networks, have found a significant point of convergence through Federated Learning (FL). Federated Learning is a Machine Learning (ML) technique that enables multiple participants to collaboratively train a model while keeping their data local. Several studies indicate that, while improving performance metrics such as accuracy, loss reduction, or computation time is a primary goal, achieving this in real-world scenarios remains challenging. This difficulty arises from the various heterogeneity characteristics inherent to the wireless devices participating in the federation. Heterogeneity in Federated Learning arises when participants contribute differently, which complicates the model training process, and it may appear at the system, statistical, and behavioral levels. System heterogeneity stems from differences in device capabilities, including processing power, transmission speeds, availability, energy constraints, and network limitations, among others. Statistical heterogeneity occurs when participants contribute non-independent and non-identically distributed (non-IID) data; such contributions can harm the global model instead of improving it, especially when the data are of poor quality or too scarce. The third type, behavioral heterogeneity, refers to cases where participants are unwilling to engage or expect rewards despite contributing minimal effort. Given the growing research in this area, we present a summary of heterogeneity characteristics in Federated Learning to provide a broader perspective on this emerging technology. We also outline key challenges, opportunities, and future directions for Federated Learning. Finally, we conduct a simulation using the LEAF framework to illustrate the impact of heterogeneity in Federated Learning.
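To make the statistical heterogeneity described above concrete, the following minimal sketch (not taken from the paper, and independent of the LEAF framework) simulates a few clients with label- and quantity-skewed local data, trains a logistic-regression model locally on each, and aggregates the weights with a FedAvg-style average weighted by local sample counts, using plain NumPy. The helper names make_noniid_clients, local_train, and fedavg_round are illustrative placeholders, not an existing API.

```python
# Minimal, hypothetical sketch of FedAvg under statistical heterogeneity
# (label- and quantity-skewed, non-IID clients). Uses only NumPy.
import numpy as np

rng = np.random.default_rng(0)

def make_noniid_clients(n_clients=5, n_features=10):
    """Build clients whose local datasets differ in size (quantity skew)
    and in class balance (label skew), emulating non-IID contributions."""
    clients = []
    for c in range(n_clients):
        n_local = int(rng.integers(50, 300))                  # quantity skew
        p_positive = 0.1 + 0.8 * c / max(n_clients - 1, 1)    # label skew
        y = (rng.random(n_local) < p_positive).astype(float)
        X = rng.normal(size=(n_local, n_features)) + y[:, None]  # class shift
        clients.append((X, y))
    return clients

def local_train(w, X, y, lr=0.1, epochs=5):
    """A few epochs of full-batch gradient descent on the logistic loss."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        grad = X.T @ (p - y) / len(y)
        w = w - lr * grad
    return w

def fedavg_round(w_global, clients):
    """One round: each client trains locally on its own data, then the
    server averages the returned weights, weighted by local sample count."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_train(w_global.copy(), X, y))
        sizes.append(len(y))
    return np.average(np.stack(updates), axis=0, weights=np.asarray(sizes, float))

clients = make_noniid_clients()
w = np.zeros(10)
for _ in range(20):
    w = fedavg_round(w, clients)

# Evaluate the single global model on each client's local data to show
# how performance can differ across heterogeneous participants.
for i, (X, y) in enumerate(clients):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    acc = ((p > 0.5) == y.astype(bool)).mean()
    print(f"client {i}: n={len(y)}, local accuracy={acc:.2f}")
```

Evaluating the resulting global model on each client's local data typically shows noticeably different accuracies across clients, which is precisely the effect non-IID contributions have on a single shared model.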
Pages: 42
References
49 in total
  • [1] Abadi, M. Proceedings of OSDI'16: 12th USENIX Symposium on Operating Systems Design and Implementation, 2016, p. 265.
  • [2] Abdelmoniem, A. M., Ho, C.-Y., Papageorgiou, P., Canini, M. Empirical Analysis of Federated Learning in Heterogeneous Environments. Proceedings of the 2022 2nd European Workshop on Machine Learning and Systems (EuroMLSys '22), 2022, pp. 1-9.
  • [3] Abdelmoniem, A. M., Ho, C.-Y., Papageorgiou, P., Canini, M. A Comprehensive Empirical Study of Heterogeneity in Federated Learning. IEEE Internet of Things Journal, 2023, 10(16): 14071-14083.
  • [4] Alazab, M., Priya, S. R. M., Parimala, M., Maddikunta, P. K. R., Gadekallu, T. R., Pham, Q.-V. Federated Learning for Cybersecurity: Concepts, Challenges, and Future Directions. IEEE Transactions on Industrial Informatics, 2022, 18(5): 3501-3509.
  • [5] Beitollahi, M., Lu, N. Federated Learning Over Wireless Networks: Challenges and Solutions. IEEE Internet of Things Journal, 2023, 10(16): 14749-14763.
  • [6] Ben Driss, M. arXiv preprint, 2023, arXiv:2312.04688.
  • [7] Beutel, D. J. arXiv preprint, 2022, arXiv:2007.14390.
  • [8] Caldas, S. arXiv preprint, 2018.
  • [9] Chai, Z., Ali, A., Zawad, S., Truex, S., Anwar, A., Baracaldo, N., Zhou, Y., Ludwig, H., Yan, F., Cheng, Y. TiFL: A Tier-based Federated Learning System. Proceedings of the 29th International Symposium on High-Performance Parallel and Distributed Computing (HPDC 2020), 2020, pp. 125-136.
  • [10] Chai, Z. Proceedings of the 2019 USENIX Conference on Operational Machine Learning, 2019, p. 19.