PFedKD: Personalized Federated Learning via Knowledge Distillation Using Unlabeled Pseudo Data for Internet of Things

Cited by: 1
Authors
Li, Hanxi [1 ]
Chen, Guorong [1 ]
Wang, Bin [1 ]
Chen, Zheng [1 ]
Zhu, Yongsheng [2 ]
Hu, Fuqiang [1 ]
Dai, Jiao [1 ]
Wang, Wei [1 ]
Affiliations
[1] Beijing Jiaotong Univ, Beijing Key Lab Secur & Privacy Intelligent Transp, Beijing 100044, Peoples R China
[2] China Railway Informat Technol Grp Corp Ltd, Inst Comp Technol, Beijing 10081, Peoples R China
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation;
Keywords
Data models; Internet of Things; Training; Adaptation models; Servers; Federated learning; Security; Performance evaluation; Data privacy; Prototypes; Federated learning (FL); knowledge distillation; personalization;
DOI
10.1109/JIOT.2025.3533003
CLC number
TP [automation technology; computer technology];
Subject classification code
0812;
Abstract
With the rapid advancement of wearable devices and Internet of Things (IoT) technologies, sensor data generated by edge devices has surged. This data is crucial for advancing IoT applications, including health status monitoring, abnormal behavior detection, and environmental monitoring. However, traditional centralized learning requires uploading data to a central server, raising security and privacy concerns and hindering data utilization. Federated learning (FL) offers a solution by enabling collaborative model training on IoT devices without data leaving the local device. In practice, the data generated by edge devices is often highly heterogeneous, making it challenging for the global FL model to capture local data distributions accurately and leading to significant performance degradation. Additionally, imbalanced edge-device resources and limited bandwidth can cause data transmission delays or interruptions, impacting application feasibility. To address these issues, we propose PFedKD, a novel personalized FL algorithm based on knowledge distillation, aimed at enhancing the model's generalization ability and reducing communication overhead in heterogeneous IoT data environments. PFedKD constructs a public dataset using unlabeled pseudo data to extract knowledge from each client, training personalized models that fit local data distributions. This method controls dataset size while enhancing performance. During communication, only logits and class prototypes are transmitted, ensuring high communication efficiency. Sharpness-aware minimization is introduced in local model training to optimize generalization. Additionally, we design a weight distribution mechanism based on client sample quality evaluation that optimizes knowledge aggregation and model personalization. Extensive experiments demonstrate that PFedKD significantly outperforms state-of-the-art baselines in both learning performance and communication efficiency.
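As a rough illustration of the exchange the abstract describes — not the authors' implementation — each client can evaluate its local model on the shared unlabeled pseudo dataset, the server can combine the resulting logits with quality-based weights, and clients can then distill from the aggregated soft labels. All function names, the temperature values, and the weighting scheme below are hypothetical sketches under these assumptions:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Convert logits to a (temperature-scaled) probability distribution."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def aggregate_client_logits(client_logits, quality_scores):
    """Quality-weighted average of per-client logits on the shared pseudo dataset.

    client_logits  : list of (n_samples, n_classes) arrays, one per client
    quality_scores : per-client sample-quality scores (higher = more influence)
    """
    w = np.asarray(quality_scores, dtype=float)
    w = w / w.sum()                    # normalize weights so they sum to 1
    stacked = np.stack(client_logits)  # shape (n_clients, n_samples, n_classes)
    return np.tensordot(w, stacked, axes=1)  # weighted sum over the client axis

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Mean KL divergence from teacher to student soft labels."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return float(np.mean(kl))
```

Since only logits (and, in PFedKD, class prototypes) cross the network, the per-round payload scales with the size of the public pseudo dataset rather than with the model's parameter count, which is the source of the communication savings claimed above.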
Pages: 16314-16324 (11 pages)