Label-wise Distribution Adaptive Federated Learning on Non-IID Data

Cited: 1
Authors
Chen, Baojian [1 ,2 ]
Li, Hongjia [1 ]
Guo, Lu [3 ]
Wang, Liming [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Informat Engn, Beijing 100093, Peoples R China
[2] Univ Chinese Acad Sci, Sch Cyber Secur, Beijing 101408, Peoples R China
[3] TravelSky Technol Ltd, Beijing 101318, Peoples R China
Source
2023 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE, WCNC | 2023
Funding
National Key R&D Program of China;
Keywords
DOI
10.1109/WCNC55385.2023.10119117
CLC Classification Number
TP3 [Computing technology, computer technology];
Discipline Code
0812
Abstract
Federated Learning (FL) has recently drawn considerable attention, as it enables multiple end devices to collaboratively learn a global model without collecting device data. In practice, end devices are often deployed in uncorrelated environments and generate non-IID data, which can cause Artificial Intelligence (AI) model weights to diverge across devices and degrade model accuracy after aggregation. In this paper, to address the non-IID data problem in FL, we treat it as a distribution adaptation problem among multiple source domains, analyze the feasibility of feature augmentation, and propose a novel method called Label-wisE Distribution Adaptive Federated Learning (LEDA-FL). First, to reduce divergence in the label-wise feature space, we integrate a modified Conditional Variational AutoEncoder (CVAE) to align the label-wise feature distributions among clients. Second, we augment the label-wise features of FL clients to improve FL performance (test accuracy and communication efficiency). Finally, we conduct extensive experiments on five popular datasets; the results show that our method improves the test accuracy of the global model (e.g., a 6.2% test accuracy improvement on CIFAR100 compared to FedProx) and the communication efficiency of FL (e.g., about a 60% reduction in communication cost on CIFAR100 compared to FedProx).
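The abstract's core idea, sharing label-wise feature-distribution information so that each client can augment the classes it lacks, can be illustrated with a heavily simplified sketch. This is not the paper's method: it replaces the modified CVAE with per-label Gaussian feature statistics, and all names here (`local_label_stats`, `aggregate_stats`, `augment`) are hypothetical illustrations.

```python
import numpy as np

def local_label_stats(features, labels):
    """Per-label mean/std of one client's feature vectors."""
    stats = {}
    for c in np.unique(labels):
        f = features[labels == c]
        stats[int(c)] = (f.mean(axis=0), f.std(axis=0) + 1e-6)
    return stats

def aggregate_stats(client_stats):
    """Average per-label statistics across clients (uniform weights)."""
    agg = {}
    all_labels = set().union(*(s.keys() for s in client_stats))
    for c in all_labels:
        pairs = [s[c] for s in client_stats if c in s]
        mu = np.mean([p[0] for p in pairs], axis=0)
        sd = np.mean([p[1] for p in pairs], axis=0)
        agg[c] = (mu, sd)
    return agg

def augment(features, labels, global_stats, n_per_label=8, rng=None):
    """Draw synthetic label-wise features from the shared distribution."""
    if rng is None:
        rng = np.random.default_rng(0)
    xs, ys = [features], [labels]
    for c, (mu, sd) in global_stats.items():
        xs.append(rng.normal(mu, sd, size=(n_per_label, mu.shape[0])))
        ys.append(np.full(n_per_label, c))
    return np.concatenate(xs), np.concatenate(ys)
```

Under this simplification, a client that only holds labels {0, 1} can still synthesize features for label 2 after aggregation, which is the intuition behind reducing label-wise divergence under non-IID data.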
Pages: 6
References
17 in total
[1]   A theory of learning from different domains [J].
Ben-David, Shai ;
Blitzer, John ;
Crammer, Koby ;
Kulesza, Alex ;
Pereira, Fernando ;
Vaughan, Jennifer Wortman .
MACHINE LEARNING, 2010, 79 (1-2) :151-175
[2]  
Gonog L, 2019, C IND ELECT APPL, P505, DOI [10.1109/ICIEA.2019.8833686, 10.1109/iciea.2019.8833686]
[3]  
Goodfellow I.J., 2009, Advances in Neural Information Processing Systems, V22, P646
[4]  
Ioffe S, 2015, PR MACH LEARN RES, V37, P448
[5]  
Li R, 2019, IEEE INT CONF BIG DA, P215, DOI 10.1109/BigData47090.2019.9006060
[6]  
Li T, 2020, Arxiv, DOI arXiv:1812.06127
[7]  
Li XX, 2021, Arxiv, DOI arXiv:2102.07623
[8]   A Collaborative Learning Framework via Federated Meta-Learning [J].
Lin, Sen ;
Yang, Guang ;
Zhang, Junshan .
2020 IEEE 40TH INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS (ICDCS), 2020, :289-299
[9]  
Mansour Y., 2008, Adv. Neural Inf. Process. Syst., V21
[10]  
McMahan HB, 2017, PR MACH LEARN RES, V54, P1273