Ensemble Distillation Based Adaptive Quantization for Supporting Federated Learning in Wireless Networks

Cited by: 11
Authors
Liu, Yi-Jing [1 ,2 ]
Feng, Gang [1 ,2 ]
Niyato, Dusit [3 ]
Qin, Shuang [1 ,2 ]
Zhou, Jianhong [4 ]
Li, Xiaoqian [1 ,2 ]
Xu, Xinyi [1 ,2 ]
Affiliations
[1] Univ Elect Sci & Technol China, Natl Key Lab Commun, Chengdu 611731, Peoples R China
[2] Univ Elect Sci & Technol China, Yangtze Delta Reg Inst Huzhou, Huzhou 313001, Peoples R China
[3] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
[4] Xihua Univ, Sch Comp & Software Engn, Chengdu 610039, Peoples R China
Funding
National Science Foundation (USA);
Keywords
Federated learning; wireless network; adaptive quantization; ensemble distillation; heterogeneous models; AGGREGATION;
DOI
10.1109/TWC.2022.3222717
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Subject Classification Codes
0808 ; 0809 ;
Abstract
Federated learning (FL) has become a promising technique for developing intelligent wireless networks. In traditional FL paradigms, local models are usually required to be homogeneous for aggregation. However, because heterogeneous models arise naturally from wireless system heterogeneity, it is preferable for user equipments (UEs) to undertake an appropriate amount of computing and/or data transmission work based on system constraints. Meanwhile, model training incurs considerable communication costs when a large number of UEs participate in FL and/or the transmitted models are large. Therefore, resource-efficient training schemes for heterogeneous models are essential for enabling FL-based intelligent wireless networks. In this paper, we propose an adaptive quantization scheme based on ensemble distillation (AQeD) to facilitate heterogeneous model training. We first partition and group the participating UEs into clusters, where the local models within a given cluster are homogeneous but use different quantization levels. Then we propose an augmented loss function that jointly considers the ensemble distillation loss, quantization levels, and wireless resource constraints. In AQeD, model aggregation is performed at two levels: model aggregation within individual clusters and distillation loss aggregation across cluster ensembles. Numerical results show that the AQeD scheme significantly reduces communication costs and training time in comparison with state-of-the-art solutions.
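The abstract describes AQeD only at a high level. The following minimal Python sketch illustrates the two-level structure it mentions: per-cluster averaging of quantized homogeneous models, then an ensemble-distillation loss computed against the averaged cluster predictions. All names here (quantize, fedavg, distill_loss, the shared public batch, the uniform quantizer) are illustrative assumptions, not the paper's actual formulation; see the DOI above for the real scheme.

```python
# Hypothetical sketch of AQeD's two-level aggregation, based only on the
# abstract. Function names, the uniform quantizer, and the shared public
# batch are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

def quantize(w, bits):
    """Uniformly quantize a weight array to 2**bits levels (assumed scheme)."""
    lo, hi = w.min(), w.max()
    if hi == lo:
        return w.copy()
    step = (hi - lo) / (2 ** bits - 1)
    return lo + np.round((w - lo) / step) * step

def fedavg(weights, sizes):
    """Level 1: average homogeneous models within one cluster, weighted by local data size."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(weights, sizes))

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distill_loss(student_logits, teacher_probs):
    """KL(teacher || student): the ensemble-distillation term of the augmented loss."""
    q = softmax(student_logits)
    return float(np.mean(np.sum(
        teacher_probs * (np.log(teacher_probs + 1e-12) - np.log(q + 1e-12)), axis=-1)))

# Two clusters of UEs: within a cluster the model architecture is the same,
# and each cluster trains at its own (adaptively chosen) quantization level.
clusters = {"low_bit": {"bits": 4, "ues": 3}, "high_bit": {"bits": 8, "ues": 2}}
dim, n_public, n_classes = 16, 5, 3
public_batch = rng.normal(size=(n_public, dim))  # shared data for distillation (assumption)

cluster_logits = {}
for name, cfg in clusters.items():
    # Each UE uploads a quantized local model; the server aggregates per cluster.
    local_models = [quantize(rng.normal(size=(dim, n_classes)), cfg["bits"])
                    for _ in range(cfg["ues"])]
    sizes = rng.integers(50, 150, size=cfg["ues"]).tolist()
    cluster_model = fedavg(local_models, sizes)
    cluster_logits[name] = public_batch @ cluster_model

# Level 2: the ensemble of cluster models acts as the teacher; each cluster
# measures its distillation loss against the averaged ensemble prediction.
teacher = softmax(np.mean(np.stack(list(cluster_logits.values())), axis=0))
for name, logits in cluster_logits.items():
    print(f"{name}: distillation loss vs. ensemble = {distill_loss(logits, teacher):.4f}")
```

In the full scheme, the distillation term enters each UE's augmented loss alongside quantization-level and wireless-resource terms; this sketch omits that joint optimization and the local training loop entirely.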
Pages: 4013-4027
Number of pages: 15