Prototype-Decomposed Knowledge Distillation for Learning Generalized Federated Representation

Cited by: 0
Authors
Wu, Aming [1 ]
Yu, Jiaping [1 ]
Wang, Yuxuan [1 ]
Deng, Cheng [1 ]
Affiliations
[1] Xidian Univ, Sch Elect Engn, Xian 710126, Peoples R China
Funding
National Natural Science Foundation of China; National Key R&D Program of China
Keywords
Prototypes; Data models; Servers; Training; Feature extraction; Federated learning; Task analysis; Federated domain generalization; class prototypes; singular value decomposition; prototype decomposition; knowledge distillation;
DOI
10.1109/TMM.2024.3428352
CLC classification: TP [Automation Technology, Computer Technology]
Discipline code: 0812
Abstract
Federated learning (FL) enables distributed clients to collaboratively learn a global model, making it a promising tool for improving data privacy in machine learning. However, despite many advances, FL performance usually degrades under domain shift, i.e., when trained models are applied to unseen domains. To enhance generalization, we focus on federated domain generalization, which aims to generalize a federated model trained on multiple source domains with different distributions to an unseen target domain. We propose a novel approach, Prototype-Decomposed Knowledge Distillation (PDKD). Concretely, we first aggregate the local class prototypes learned by different clients. Subsequently, Singular Value Decomposition (SVD) is employed to decompose the local prototypes into discriminative and generalized global prototypes that contain rich category-related information. Finally, the global prototypes are sent back to all clients, and knowledge distillation encourages local client models to distill generalized knowledge from them, which boosts generalization ability. Extensive experiments on multiple datasets demonstrate the effectiveness of our method; in particular, on the Office dataset it outperforms FedAvg by around 13.5%, showing that it substantially improves the generalization ability of federated models.
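The pipeline the abstract describes — aggregate local class prototypes on the server, decompose them with SVD into generalized global prototypes, and distill those prototypes back into client models — can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function names, the top-k rank truncation, and the squared-distance distillation objective are all assumptions.

```python
import numpy as np

def global_prototypes_via_svd(client_protos, top_k=1):
    """Build global prototypes from local ones via SVD (illustrative sketch).

    client_protos: array of shape (num_clients, num_classes, dim) holding each
    client's per-class feature prototypes. For every class, the cross-client
    prototype matrix is truncated to its top_k singular components, keeping
    the dominant, shared directions and discarding client-specific noise.
    Returns an array of shape (num_classes, dim).
    """
    n_clients, n_classes, dim = client_protos.shape
    global_protos = np.zeros((n_classes, dim))
    for c in range(n_classes):
        M = client_protos[:, c, :]                       # (num_clients, dim)
        U, S, Vt = np.linalg.svd(M, full_matrices=False)
        # Rank-top_k reconstruction, then average over clients.
        approx = (U[:, :top_k] * S[:top_k]) @ Vt[:top_k, :]
        global_protos[c] = approx.mean(axis=0)
    return global_protos

def prototype_distillation_loss(features, labels, global_protos):
    """Mean squared distance between each sample's feature and the global
    prototype of its class -- a simple stand-in for a prototype-based
    knowledge-distillation term added to the local training objective."""
    diffs = features - global_protos[labels]
    return float((diffs ** 2).sum(axis=1).mean())
```

During each communication round, clients would minimize their task loss plus this distillation term, pulling local features toward the generalized global prototypes.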
Pages: 10991-11002
Page count: 12
Related Papers (50 records; entries [21]-[30] shown)
[21] Feng, Xiachong; Feng, Xiaocheng; Du, Xiyuan; Kan, Min-Yen; Qin, Bing. Adapter-Based Selective Knowledge Distillation for Federated Multi-Domain Meeting Summarization. IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2024, 32: 3694-3708.
[22] Mohammed, Malik Naik; Zhang, Xinyue; Valero, Maria; Xie, Ying. Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation. 2023 IEEE/ACM CONFERENCE ON CONNECTED HEALTH: APPLICATIONS, SYSTEMS AND ENGINEERING TECHNOLOGIES (CHASE), 2023: 207-208.
[23] Chen, Zheyi; Tian, Pu; Liao, Weixian; Chen, Xuhui; Xu, Guobin; Yu, Wei. Resource-Aware Knowledge Distillation for Federated Learning. IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTING, 2023, 11(3): 706-719.
[24] Han, Sungwon; Park, Sungwon; Wu, Fangzhao; Kim, Sundong; Wu, Chuhan; Xie, Xing; Cha, Meeyoung. FedX: Unsupervised Federated Learning with Cross Knowledge Distillation. COMPUTER VISION - ECCV 2022, PT XXX, 2022, 13690: 691-707.
[25] Nguyen, Tuan Dung; Tong, Ngoc Anh; Nguyen, Binh P.; Nguyen, Quoc Viet Hung; Nguyen, Phi Le; Huynh, Thanh Trung. Hierarchical Federated Learning in MEC Networks with Knowledge Distillation. 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2024.
[26] Yao, Wenjie; Sun, Guanglu; Zhu, Suxia; Wang, Ruidong; Zhu, Xinzhong; Xu, HuiYing; Wei, Xiguang. FedRDA: Representation Deviation Alignment in Heterogeneous Federated Learning. IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2025.
[27] Li, Zeyu; Yao, Wenbin; Luo, Juanjuan; Huang, Zhibin. Flow-Based IoT Intrusion Detection via Improved Generative Federated Distillation Learning. IEEE INTERNET OF THINGS JOURNAL, 2025, 12(10): 14797-14811.
[28] Wu, Feng; Tan, Alysa Ziying; Feng, Siwei; Yu, Han; Deng, Tao; Zhao, Libang; Chen, Yuanlu. Federated Class-Incremental Learning via Weighted Aggregation and Distillation. IEEE INTERNET OF THINGS JOURNAL, 2025, 12(12): 22489-22503.
[29] Huang, Wenke; Liu, Yuxia; Ye, Mang; Chen, Jun; Du, Bo. Federated Learning With Long-Tailed Data via Representation Unification and Classifier Rectification. IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2024, 19: 5738-5750.
[30] Wang, Aili; Zhang, Yutong; Yan, Yixin. Heterogeneous Defect Prediction Based on Federated Transfer Learning via Knowledge Distillation. IEEE ACCESS, 2021, 9: 29530-29540.