Prototype-Decomposed Knowledge Distillation for Learning Generalized Federated Representation

Cited: 0
Authors
Wu, Aming [1]
Yu, Jiaping [1]
Wang, Yuxuan [1]
Deng, Cheng [1]
Affiliations
[1] Xidian Univ, Sch Elect Engn, Xian 710126, Peoples R China
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China
Keywords
Prototypes; Data models; Servers; Training; Feature extraction; Federated learning; Task analysis; Federated domain generalization; class prototypes; singular value decomposition; prototype decomposition; knowledge distillation;
DOI
10.1109/TMM.2024.3428352
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Federated learning (FL) enables distributed clients to collaboratively learn a global model, suggesting its potential for improving data privacy in machine learning. However, although FL has made many advances, its performance usually degrades due to domain shift when the trained models are applied to unseen domains. To enhance the model's generalization ability, we focus on solving federated domain generalization, which aims to generalize a federated model trained on multiple source domains with different distributions to an unseen target domain. To this end, we propose a novel approach, Prototype-Decomposed Knowledge Distillation (PDKD). Concretely, we first aggregate the local class prototypes that are learned from different clients. Subsequently, Singular Value Decomposition (SVD) is employed to decompose the local prototypes into discriminative and generalized global prototypes that contain rich category-related information. Finally, the global prototypes are sent back to all clients, and knowledge distillation is exploited to encourage local client models to distill generalized knowledge from the global prototypes, which boosts their generalization ability. Extensive experiments on multiple datasets demonstrate the effectiveness of our method. In particular, on the Office dataset, our method outperforms FedAvg by around 13.5%, which shows that it substantially improves the generalization ability of federated models.
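A minimal sketch of the two steps described in the abstract (server-side SVD-based prototype decomposition and client-side prototype distillation) is given below in PyTorch. The shapes of the stacked client prototypes, the number of singular directions kept after the SVD, the mean used to fuse clients, and the temperature of the distillation term are illustrative assumptions; the record above does not specify them, and this is not the authors' released implementation.

import torch
import torch.nn.functional as F

def decompose_prototypes(local_protos: torch.Tensor, rank: int = 4) -> torch.Tensor:
    # local_protos: (num_clients, num_classes, feat_dim) class prototypes
    # collected by the server from all clients.
    num_clients, num_classes, feat_dim = local_protos.shape
    global_protos = []
    for c in range(num_classes):
        P = local_protos[:, c, :]                      # (num_clients, feat_dim)
        U, S, Vh = torch.linalg.svd(P, full_matrices=False)
        k = min(rank, S.numel())                       # assumed rank truncation
        # Keep only the leading singular directions, taken here as the
        # generalized, category-related component shared across clients.
        P_low = (U[:, :k] * S[:k]) @ Vh[:k, :]
        global_protos.append(P_low.mean(dim=0))        # fuse clients (assumed: mean)
    return torch.stack(global_protos)                  # (num_classes, feat_dim)

def prototype_distillation_loss(features: torch.Tensor,
                                labels: torch.Tensor,
                                global_protos: torch.Tensor,
                                temperature: float = 1.0) -> torch.Tensor:
    # A generic prototype-based distillation term (assumed form): similarities
    # between local features and the global prototypes serve as logits, so
    # each client's features are pulled toward its class's global prototype.
    logits = features @ global_protos.t() / temperature
    return F.cross_entropy(logits, labels)

# Shape-only usage example: 5 clients, 10 classes, 64-d features.
# protos = torch.randn(5, 10, 64)
# g = decompose_prototypes(protos)                     # -> (10, 64)
# loss = prototype_distillation_loss(torch.randn(32, 64),
#                                    torch.randint(0, 10, (32,)), g)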
Pages: 10991-11002
Page count: 12
Related Papers
50 records in total
  • [1] Prototype Similarity Distillation for Communication-Efficient Federated Unsupervised Representation Learning
    Zhang, Chen
    Xie, Yu
    Chen, Tingbin
    Mao, Wenjie
    Yu, Bin
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (11) : 6865 - 6876
  • [2] Federated Split Learning via Mutual Knowledge Distillation
    Luo, Linjun
    Zhang, Xinglin
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (03) : 2729 - 2741
  • [3] Heterogeneous Federated Learning Framework for IIoT Based on Selective Knowledge Distillation
    Guo, Sheng
    Chen, Hui
    Liu, Yang
    Yang, Chengyi
    Li, Zengxiang
    Jin, Cheng Hao
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2025, 21 (02) : 1078 - 1089
  • [4] Global prototype distillation for heterogeneous federated learning
    Wu, Shu
    Chen, Jindou
    Nie, Xueli
    Wang, Yong
    Zhou, Xiancun
    Lu, Linlin
    Peng, Wei
    Nie, Yao
    Menhaj, Waseef
    SCIENTIFIC REPORTS, 2024, 14 (01)
  • [5] A Prototype-Based Knowledge Distillation Framework for Heterogeneous Federated Learning
    Lyu, Feng
    Tang, Cheng
    Deng, Yongheng
    Liu, Tong
    Zhang, Yongmin
    Zhang, Yaoxue
    2023 IEEE 43RD INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS, ICDCS, 2023 : 37 - 47
  • [6] FedPA: Generator-Based Heterogeneous Federated Prototype Adversarial Learning
    Jiang, Lei
    Wang, Xiaoding
    Yang, Xu
    Shu, Jiwu
    Lin, Hui
    Yi, Xun
    IEEE TRANSACTIONS ON DEPENDABLE AND SECURE COMPUTING, 2025, 22 (02) : 939 - 949
  • [7] Learning Critically: Selective Self-Distillation in Federated Learning on Non-IID Data
    He, Yuting
    Chen, Yiqiang
    Yang, XiaoDong
    Yu, Hanchao
    Huang, Yi-Hua
    Gu, Yang
    IEEE TRANSACTIONS ON BIG DATA, 2024, 10 (06) : 789 - 800
  • [8] Federated Learning Algorithm Based on Knowledge Distillation
    Jiang, Donglin
    Shan, Chen
    Zhang, Zhihui
    2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTER ENGINEERING (ICAICE 2020), 2020 : 163 - 167
  • [9] WHEN FEDERATED LEARNING MEETS KNOWLEDGE DISTILLATION
    Pang, Xiaoyi
    Hu, Jiahui
    Sun, Peng
    Ren, Ju
    Wang, Zhibo
    IEEE WIRELESS COMMUNICATIONS, 2024, 31 (05) : 208 - 214
  • [10] FedDKD: Federated learning with decentralized knowledge distillation
    Xinjia Li
    Boyu Chen
    Wenlian Lu
    Applied Intelligence, 2023, 53 : 18547 - 18563