Distributionally Robust Federated Learning for Mobile Edge Networks

Cited by: 0
Authors
Le, Long Tan [1 ]
Nguyen, Tung-Anh [1 ]
Nguyen, Tuan-Dung [2 ]
Tran, Nguyen H. [1 ]
Truong, Nguyen Binh [3 ]
Vo, Phuong L. [4 ,5 ]
Hung, Bui Thanh [6 ]
Le, Tuan Anh [7 ]
Affiliations
[1] Univ Sydney, Sch Comp Sci, Darlington, NSW 2008, Australia
[2] Australian Natl Univ, Sch Comp, Canberra, ACT 2601, Australia
[3] Univ Glasgow, Sch Comp Sci, Glasgow City G12 8RZ, Scotland
[4] Int Univ, Sch Comp Sci & Engn, Ho Chi Minh City 700000, Vietnam
[5] Vietnam Natl Univ, Ho Chi Minh City 700000, Vietnam
[6] Ind Univ Ho Chi Minh City, Ho Chi Minh City 700000, Vietnam
[7] Thu Dau Mot Univ, Inst Engn & Technol, Binh Duong 820000, Vietnam
Keywords
Federated learning; Distributionally robust optimization; Wasserstein distance;
DOI
10.1007/s11036-024-02316-w
CLC Classification
TP3 [Computing technology, Computer technology];
Discipline Code
0812;
Abstract
Federated Learning (FL) revolutionizes data processing in mobile networks by enabling collaborative learning without data exchange. This not only reduces latency and enhances computational efficiency but also allows the system to adapt, learn, and optimize performance from the user's context in real time. Nevertheless, FL faces challenges in training and generalization due to statistical heterogeneity, stemming from the diverse nature of data across varying user contexts. To address these challenges, we propose WAFL, a robust FL framework grounded in Wasserstein distributionally robust optimization, aimed at enhancing model generalization against all adversarial distributions within a predefined Wasserstein ambiguity set. We approach WAFL by formulating it as an empirical surrogate risk minimization problem, which is then solved using a novel federated algorithm. Experimental results demonstrate that WAFL outperforms other robust FL baselines in non-i.i.d. settings, showcasing superior generalization and robustness to significant distribution shifts.
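For context, the Wasserstein distributionally robust objective the abstract refers to is commonly written as a min-max problem over a Wasserstein ambiguity set. A minimal sketch in standard notation follows, where \theta denotes the model parameters, \ell the loss, \widehat{P}_n a client's local empirical distribution, W the Wasserstein distance, and \rho the ambiguity radius; this notation is illustrative, and the paper's exact surrogate formulation may differ:
$$\min_{\theta}\; \sup_{Q \,:\, W(Q,\,\widehat{P}_n)\le \rho}\; \mathbb{E}_{\xi \sim Q}\big[\ell(\theta;\xi)\big].$$
In a federated setting, each client would solve or approximate this worst-case objective locally before the usual aggregation step; the empirical surrogate risk mentioned in the abstract would then stand in for the inner supremum as a tractable bound.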
Pages: 262-272
Page count: 11