Topology Learning for Heterogeneous Decentralized Federated Learning Over Unreliable D2D Networks
Cited by: 1
Authors:
Wu, Zheshun [1]; Xu, Zenglin [1]; Zeng, Dun [2]; Li, Junfan [1]; Liu, Jie [1]
Affiliations:
[1] Harbin Inst Technol Shenzhen, Sch Comp Sci & Technol, Shenzhen 518055, Peoples R China
[2] Univ Elect Sci & Technol China, Dept Comp Sci & Engn, Chengdu 611731, Peoples R China
Keywords:
Device-to-device communication;
Convergence;
Training;
Topology;
Federated learning;
Network topology;
Stochastic processes;
D2D networks;
data heterogeneity;
decentralized federated learning;
topology learning;
unreliable links;
SELECTION;
DOI:
10.1109/TVT.2024.3376708
Chinese Library Classification (CLC):
TM [Electrical Engineering];
TN [Electronics & Communication Technology];
Discipline Classification Code:
0808;
0809;
Abstract:
With the proliferation of intelligent mobile devices in wireless device-to-device (D2D) networks, decentralized federated learning (DFL) has attracted significant interest. Compared to centralized federated learning (CFL), DFL mitigates the risk of central server failures due to communication bottlenecks. However, DFL faces several challenges, such as the severe heterogeneity of data distributions in diverse environments, and the transmission outages and packet errors caused by the adoption of the User Datagram Protocol (UDP) in D2D networks. These challenges often degrade the convergence of DFL training. To address them, we conduct a thorough theoretical convergence analysis for DFL and derive a convergence bound. By defining a novel quantity named unreliable links-aware neighborhood discrepancy in this convergence bound, we formulate a tractable optimization objective and develop a novel Topology Learning method considering the Representation Discrepancy and Unreliable Links in DFL, named ToLRDUL. Extensive experiments under both feature-skew and label-skew settings have validated the effectiveness of our proposed method, demonstrating improved convergence speed and test accuracy, consistent with our theoretical findings.
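To make the setting concrete, the following is a minimal sketch of one decentralized-FL gossip round over unreliable D2D links. It is illustrative only and is not the paper's ToLRDUL algorithm: the mixing matrix, the failure probability `p_fail`, and the fallback-to-own-model rule on a dropped link are all assumptions for the sake of the example.

```python
import numpy as np

def gossip_round(models, mixing, p_fail, rng):
    """One decentralized averaging round over unreliable links.

    models: (n, d) array, one model vector per node.
    mixing: (n, n) doubly stochastic mixing matrix.
    p_fail: probability that a D2D transmission (i <- j, i != j) is lost.
    On a lost link, node i reuses its own model in that neighbor's slot,
    so the weights still sum to one.
    """
    n, d = models.shape
    new = np.zeros_like(models)
    for i in range(n):
        acc = np.zeros(d)
        for j in range(n):
            w = mixing[i, j]
            if w == 0.0:
                continue  # j is not a neighbor of i in this topology
            if i == j or rng.random() >= p_fail:
                acc += w * models[j]      # transmission succeeded
            else:
                acc += w * models[i]      # outage: fall back to own model
        new[i] = acc
    return new

rng = np.random.default_rng(0)
n, d = 4, 3
models = rng.normal(size=(n, d))
mixing = np.full((n, n), 1.0 / n)  # fully connected, uniform weights
updated = gossip_round(models, mixing, p_fail=0.2, rng=rng)
```

With `p_fail = 0` and a doubly stochastic mixing matrix over a connected graph, repeated rounds drive all nodes toward the average model; unreliable links slow or bias this consensus, which is the degradation the paper's topology-learning objective targets.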
Pages: 12201-12206
Page count: 6