Knowledge distillation in federated learning: a comprehensive survey

Times cited: 0
Authors
Salman, Hassan [1 ]
Zaki, Chamseddine [2 ]
Charara, Nour [3 ]
Guehis, Sonia [4 ]
Pradat-Peyre, Jean-Francois [1 ]
Nasser, Abbass [5 ]
Affiliations
[1] Sorbonne Univ, CNRS, UMR 7606, LIP6, Paris, France
[2] Amer Univ Middle East, Coll Engn & Technol, Egaila 54200, Kuwait
[3] Amer Univ Culture & Educ, Fac Arts & Sci, Lab Comp Sci Dept, ICCS, Beirut, Lebanon
[4] PSL Res Univ, Paris Dauphine Univ, LAMSADE, CNRS,UMR 7243, Paris, France
[5] Holy Spirit Univ Kaslik USEK, Business Sch, POB 446, Jounieh, Lebanon
Keywords
Federated Learning; Knowledge Distillation; Transfer Learning; Data Heterogeneity; Model Heterogeneity; Non-independent and Identically Distributed (Non-IID) Data
DOI
10.1007/s10791-025-09657-4
Chinese Library Classification
TP (Automation and Computer Technology)
Discipline Classification Code
0812
Abstract
Federated Learning (FL) has recently emerged as a promising approach for training machine learning models in a distributed manner without requiring central data storage. However, the inherent heterogeneity and discrepancies in the data contributed by FL participants can pose a substantial obstacle when aggregating their local models. Researchers have proposed various solutions to this problem, one of which is knowledge distillation (KD): transferring knowledge from a larger, more accurate model to a smaller one, thereby improving the smaller model's performance. This survey examines in detail how effectively KD addresses the challenges posed by FL. We comprehensively review existing research, emphasizing the benefits and limitations of applying KD techniques in FL, and discuss the open challenges and research questions in this field.
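To make the KD idea referred to in the abstract concrete, the following is a minimal sketch of the classic soft-target distillation objective (temperature-scaled KL divergence between teacher and student outputs). It is an illustration in plain Python, not code from the surveyed paper; the function names and the choice of temperature are assumptions for the example.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax: higher T yields softer probabilities."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL divergence between the teacher's softened distribution and the
    student's, scaled by T^2 (the usual soft-target KD objective)."""
    p = softmax(teacher_logits, T)   # teacher's soft targets
    q = softmax(student_logits, T)   # student's predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

# A student that matches the teacher incurs zero loss; a mismatched one does not.
same = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
diff = distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0])
```

In an FL setting, variants of this loss let clients exchange model outputs (e.g. on a shared public dataset) instead of raw weights, which is what makes KD attractive under data and model heterogeneity.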
Pages: 40