Ensemble Distillation Based Adaptive Quantization for Supporting Federated Learning in Wireless Networks

Cited by: 5
Authors
Liu, Yi-Jing [1 ,2 ]
Feng, Gang [1 ,2 ]
Niyato, Dusit [3 ]
Qin, Shuang [1 ,2 ]
Zhou, Jianhong [4 ]
Li, Xiaoqian [1 ,2 ]
Xu, Xinyi [1 ,2 ]
Affiliations
[1] Univ Elect Sci & Technol China, Natl Key Lab Commun, Chengdu 611731, Peoples R China
[2] Univ Elect Sci & Technol China, Yangtze Delta Reg Inst Huzhou, Huzhou 313001, Peoples R China
[3] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
[4] Xihua Univ, Sch Comp & Software Engn, Chengdu 610039, Peoples R China
Funding
National Science Foundation (USA);
Keywords
Federated learning; wireless network; adaptive quantization; ensemble distillation; heterogeneous models; AGGREGATION;
DOI
10.1109/TWC.2022.3222717
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
Federated learning (FL) has become a promising technique for developing intelligent wireless networks. In traditional FL paradigms, local models are usually required to be homogeneous for aggregation. However, because heterogeneous models naturally arise from wireless system heterogeneity, it is preferable for user equipments (UEs) to undertake an appropriate amount of computing and/or data transmission work based on system constraints. Meanwhile, model training incurs considerable communication costs when a large number of UEs participate in FL and/or the transmitted models are large. Therefore, resource-efficient training schemes for heterogeneous models are essential for enabling FL-based intelligent wireless networks. In this paper, we propose an adaptive quantization scheme based on ensemble distillation (AQeD) to facilitate heterogeneous model training. We first partition the participating UEs into clusters, where the local models within a specific cluster are homogeneous but use different quantization levels. Then we propose an augmented loss function that jointly considers the ensemble distillation loss, quantization levels, and wireless resource constraints. In AQeD, model aggregation is performed at two levels: model aggregation within individual clusters and distillation loss aggregation across the cluster ensemble. Numerical results show that the AQeD scheme can significantly reduce communication costs and training time compared with state-of-the-art solutions.
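The abstract's pipeline (per-cluster quantization, intra-cluster averaging, an ensemble distillation term, and an augmented loss that penalizes quantization cost) can be sketched in a few lines. This is a minimal illustrative sketch only: the uniform quantizer, the KL-based distillation term, and the weights `lam`/`mu` are assumptions for exposition, not the paper's actual formulation.

```python
import numpy as np

def quantize(weights, bits):
    """Uniform quantization of a weight vector to the given bit width (assumed quantizer)."""
    levels = 2 ** bits - 1
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / levels if levels > 0 and w_max > w_min else 1.0
    return np.round((weights - w_min) / scale) * scale + w_min

def cluster_aggregate(client_updates):
    """Intra-cluster aggregation: homogeneous models in one cluster are averaged."""
    return np.mean(client_updates, axis=0)

def _softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def ensemble_distillation_loss(student_logits, teacher_logits_list):
    """KL divergence from the averaged soft predictions of the cluster ensemble."""
    teacher = np.mean([_softmax(t) for t in teacher_logits_list], axis=0)
    student = _softmax(student_logits)
    return float(np.sum(teacher * (np.log(teacher + 1e-12) - np.log(student + 1e-12))))

def augmented_loss(task_loss, distill_loss, bits, lam=0.5, mu=0.01):
    """Augmented objective: task loss + distillation term + a penalty growing with
    the quantization bit width (standing in for the wireless resource constraint)."""
    return task_loss + lam * distill_loss + mu * bits
```

In this sketch, a cluster assigned a low bit width pays less in the `mu * bits` communication term but loses precision in `quantize`, which mirrors the accuracy/communication trade-off the augmented loss is meant to balance.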
Pages: 4013-4027
Number of pages: 15