FedCiR: Client-Invariant Representation Learning for Federated Non-IID Features

Cited by: 1
Authors
Li, Zijian [1 ]
Lin, Zehong [1 ]
Shao, Jiawei [1 ]
Mao, Yuyi [2 ]
Zhang, Jun [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Hong Kong, Peoples R China
[2] Hong Kong Polytech Univ, Dept Elect & Elect Engn, Hong Kong, Peoples R China
Keywords
Training; Representation learning; Feature extraction; Distributed databases; Data models; Mutual information; Servers; federated learning (FL); non-independent and identically distributed (non-IID) data; edge intelligence;
DOI
10.1109/TMC.2024.3376697
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Federated learning (FL) is a distributed learning paradigm that maximizes the potential of data-driven models for edge devices without sharing their raw data. However, devices often have non-independent and identically distributed (non-IID) data, meaning their local data distributions can vary significantly. The heterogeneity in input data distributions across devices, commonly referred to as the feature shift problem, can adversely impact the training convergence and accuracy of the global model. To analyze the intrinsic causes of the feature shift problem, we develop a generalization error bound in FL, which motivates us to propose FedCiR, a client-invariant representation learning framework that enables clients to extract informative and client-invariant features. Specifically, we improve the mutual information term between representations and labels to encourage representations to carry essential classification knowledge, and diminish the mutual information term between the client set and representations conditioned on labels to promote representations of clients to be client-invariant. We further incorporate two regularizers into the FL framework to bound the mutual information terms with an approximate global representation distribution to compensate for the absence of the ground-truth global representation distribution, thus achieving informative and client-invariant feature extraction. To achieve global representation distribution approximation, we propose a data-free mechanism performed by the server without compromising privacy. Extensive experiments demonstrate the effectiveness of our approach in achieving client-invariant representation learning and solving the data heterogeneity issue.
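The abstract describes two mutual-information regularizers and a server-side approximation of the global representation distribution. Below is a minimal, illustrative PyTorch sketch of what such a local objective could look like; it is not the authors' implementation. It assumes a stochastic encoder that outputs a diagonal-Gaussian posterior over representations, and that the server has broadcast per-class Gaussian parameters (global_mu, global_logvar) as the approximate global representation distribution. The class-conditional KL term stands in for the paper's conditional mutual-information bound, and all names (fedcir_style_local_loss, lam, etc.) are hypothetical.

import torch
import torch.nn.functional as F

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    # KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal Gaussians, summed over feature dims.
    var_q, var_p = logvar_q.exp(), logvar_p.exp()
    return 0.5 * ((logvar_p - logvar_q) + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0).sum(dim=-1)

def fedcir_style_local_loss(encoder, classifier, x, y, global_mu, global_logvar, lam=0.1):
    # Hypothetical local objective: cross-entropy keeps representations informative about labels
    # (a lower bound related to I(Z; Y)), while the class-conditional KL to the server-broadcast
    # global distribution discourages client-specific information in Z given Y.
    mu, logvar = encoder(x)                                   # posterior parameters, shape [B, d]
    z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()      # reparameterized sample of Z
    ce = F.cross_entropy(classifier(z), y)                    # informativeness term
    kl = kl_diag_gaussians(mu, logvar, global_mu[y], global_logvar[y]).mean()  # invariance term
    return ce + lam * kl

In a full pipeline, the server would refresh global_mu and global_logvar each round by aggregating the clients' per-class representation statistics rather than raw data, which is one data-free way to realize the global-distribution approximation the abstract refers to.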
Pages: 10509-10522
Number of pages: 14
Related Articles (50 in total)
  • [1] Client Selection for Federated Learning With Non-IID Data in Mobile Edge Computing
    Zhang, Wenyu
    Wang, Xiumin
    Zhou, Pan
    Wu, Weiwei
    Zhang, Xinglin
    IEEE ACCESS, 2021, 9 : 24462 - 24474
  • [2] Federated Learning With Non-IID Data: A Survey
    Lu, Zili
    Pan, Heng
    Dai, Yueyue
    Si, Xueming
    Zhang, Yan
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (11): 19188 - 19209
  • [3] Federated Learning With Taskonomy for Non-IID Data
    Jamali-Rad, Hadi
    Abdizadeh, Mohammad
    Singh, Anuj
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (11) : 8719 - 8730
  • [4] Feature Matching Data Synthesis for Non-IID Federated Learning
    Li, Zijian
    Sun, Yuchang
    Shao, Jiawei
    Mao, Yuyi
    Wang, Jessie Hui
    Zhang, Jun
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (10) : 9352 - 9367
  • [5] Adaptive Federated Learning on Non-IID Data With Resource Constraint
    Zhang, Jie
    Guo, Song
    Qu, Zhihao
    Zeng, Deze
    Zhan, Yufeng
    Liu, Qifeng
    Akerkar, Rajendra
    IEEE TRANSACTIONS ON COMPUTERS, 2022, 71 (07) : 1655 - 1667
  • [6] WSCC: A Weight-Similarity-Based Client Clustering Approach for Non-IID Federated Learning
    Tian, Pu
    Liao, Weixian
    Yu, Wei
    Blasch, Erik
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (20) : 20243 - 20256
  • [7] Federated Learning With Non-IID Data in Wireless Networks
    Zhao, Zhongyuan
    Feng, Chenyuan
    Hong, Wei
    Jiang, Jiamo
    Jia, Chao
    Quek, Tony Q. S.
    Peng, Mugen
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2022, 21 (03) : 1927 - 1942
  • [8] Mitigating Update Conflict in Non-IID Federated Learning via Orthogonal Class Gradients
    Guo, Siyang
    Guo, Yaming
    Zhang, Hui
    Wang, Junbo
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2025, 24 (04) : 2967 - 2978
  • [9] FedPD: A Federated Learning Framework With Adaptivity to Non-IID Data
    Zhang, Xinwei
    Hong, Mingyi
    Dhople, Sairaj
    Yin, Wotao
    Liu, Yang
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2021, 69 (69) : 6055 - 6070
  • [10] A Study of Enhancing Federated Learning on Non-IID Data with Server Learning
    Mai, V. S.
    La, R. J.
    Zhang, T.
    IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE, 2024, 5 (11): 1 - 15