A Network Resource Aware Federated Learning Approach using Knowledge Distillation

Cited: 2
Authors
Mishra, Rahul [1 ]
Gupta, Hari Prabhat [1 ]
Dutta, Tanima [1 ]
Affiliations
[1] Indian Inst Technol BHU, Dept Comp Sci & Engn, Varanasi, Uttar Pradesh, India
Source
IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (IEEE INFOCOM WKSHPS 2021) | 2021
Keywords
Federated learning; network resources
DOI
10.1109/INFOCOMWKSHPS51825.2021.9484597
Chinese Library Classification: TP301 [Theory and Methods]
Discipline code: 081202
Abstract
Federated Learning (FL) has seen unprecedented growth in the past few years because it preserves data privacy. This poster proposes a network-resource-aware federated learning approach that uses knowledge distillation to train a machine learning model on local data samples. The approach partitions clients into groups based on the bandwidth between each client and the server, then iteratively applies FL to each group, compressing the model via knowledge distillation. This reduces the bandwidth requirement and yields a more robust model trained on the data of all clients without compromising privacy.
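The grouping-and-aggregation loop described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the bandwidth thresholds, the client-to-bandwidth mapping, and the `distill` stand-in (which simply truncates the parameter vector in place of training a student network on the teacher's soft outputs) are all assumptions for the sake of the example.

```python
def group_by_bandwidth(client_bandwidth, thresholds):
    """Partition clients into groups by measured bandwidth (Mbps).

    client_bandwidth: dict mapping client id -> bandwidth to the server.
    thresholds: ascending cut-offs; a client with bandwidth >= k thresholds
    lands in group k, so higher group index means faster links.
    """
    groups = {g: [] for g in range(len(thresholds) + 1)}
    for cid, bw in client_bandwidth.items():
        g = sum(bw >= t for t in thresholds)
        groups[g].append(cid)
    return groups


def fedavg(weight_vectors):
    """Plain FedAvg: element-wise mean of the clients' flat parameter lists."""
    n = len(weight_vectors)
    return [sum(ws) / n for ws in zip(*weight_vectors)]


def distill(teacher_weights, ratio):
    """Toy compression step standing in for knowledge distillation:
    keep only a prefix of the parameters before sending the model on
    to the next (lower-bandwidth) group."""
    k = max(1, int(len(teacher_weights) * ratio))
    return teacher_weights[:k]


# Example round: group clients, average within a group, compress the result.
clients = {"c1": 5.0, "c2": 50.0, "c3": 120.0}       # hypothetical Mbps values
groups = group_by_bandwidth(clients, thresholds=[10.0, 100.0])
global_model = fedavg([[1.0, 2.0, 3.0, 4.0], [3.0, 4.0, 5.0, 6.0]])
compact_model = distill(global_model, ratio=0.5)
```

In a real system each client in a group would train locally before `fedavg`, and `distill` would train a smaller student model on the averaged teacher's soft labels so that slower groups receive a model matched to their link capacity.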
Pages: 2