Ensemble Distillation Based Adaptive Quantization for Supporting Federated Learning in Wireless Networks

Citations: 5
Authors
Liu, Yi-Jing [1 ,2 ]
Feng, Gang [1 ,2 ]
Niyato, Dusit [3 ]
Qin, Shuang [1 ,2 ]
Zhou, Jianhong [4 ]
Li, Xiaoqian [1 ,2 ]
Xu, Xinyi [1 ,2 ]
Affiliations
[1] Univ Elect Sci & Technol China, Natl Key Lab Commun, Chengdu 611731, Peoples R China
[2] Univ Elect Sci & Technol China, Yangtze Delta Reg Inst Huzhou, Huzhou 313001, Peoples R China
[3] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
[4] Xihua Univ, Sch Comp & Software Engn, Chengdu 610039, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
Federated learning; wireless network; adaptive quantization; ensemble distillation; heterogeneous models; AGGREGATION;
DOI
10.1109/TWC.2022.3222717
Chinese Library Classification (CLC)
TM [electrical engineering]; TN [electronics and communication technology];
Discipline Codes
0808 ; 0809 ;
Abstract
Federated learning (FL) has become a promising technique for developing intelligent wireless networks. In traditional FL paradigms, local models are usually required to be homogeneous for aggregation. However, because wireless system heterogeneity naturally gives rise to heterogeneous models, it is preferable for user equipments (UEs) to undertake an appropriate amount of computing and/or data transmission work based on system constraints. Meanwhile, model training incurs considerable communication costs when a large number of UEs participate in FL and/or the transmitted models are large. Therefore, resource-efficient training schemes for heterogeneous models are essential for enabling FL-based intelligent wireless networks. In this paper, we propose an adaptive quantization scheme based on ensemble distillation (AQeD) to facilitate heterogeneous model training. We first partition and group the participating UEs into clusters, where the local models within a cluster are homogeneous but use different quantization levels. Then we propose an augmented loss function that jointly considers the ensemble distillation loss, quantization levels, and wireless resource constraints. In AQeD, model aggregation is performed at two levels: model aggregation within individual clusters and distillation loss aggregation across the cluster ensemble. Numerical results show that the AQeD scheme can significantly reduce communication costs and training time in comparison with state-of-the-art solutions.
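The record does not include the paper's actual equations, so the sketch below only illustrates the two-level structure the abstract describes: homogeneous models averaged within a cluster, cluster outputs combined into an ensemble teacher, and a distillation term penalizing each cluster's deviation from that teacher. Uniform quantization and a KL-divergence distillation loss are placeholder choices, and all function names are hypothetical, not taken from the paper.

```python
import numpy as np

def quantize(w, bits):
    # Uniform quantization of a weight vector to 2**bits levels,
    # standing in for the per-cluster quantization level in AQeD.
    lo, hi = w.min(), w.max()
    levels = 2 ** bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return lo + np.round((w - lo) / scale) * scale

def cluster_aggregate(client_weights):
    # Level 1: FedAvg-style mean over homogeneous models in one cluster.
    return np.mean(client_weights, axis=0)

def ensemble_logits(cluster_logits):
    # Level 2: average cluster outputs to form the ensemble teacher.
    return np.mean(cluster_logits, axis=0)

def distillation_loss(student_logits, teacher_logits):
    # KL divergence between softmaxed teacher and student outputs,
    # a common choice for an ensemble-distillation penalty.
    def softmax(z):
        e = np.exp(z - z.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)
    p, q = softmax(teacher_logits), softmax(student_logits)
    return float(np.sum(p * (np.log(p) - np.log(q))))
```

In this reading of the abstract, each cluster would aggregate its own quantized models with `cluster_aggregate`, while the server combines the clusters' predictions via `ensemble_logits` and feeds `distillation_loss` back into each cluster's augmented training objective.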
Pages: 4013 - 4027
Page count: 15
Related Papers
50 records
  • [1] Resource Consumption for Supporting Federated Learning in Wireless Networks
    Liu, Yi-Jing
    Qin, Shuang
    Sun, Yao
    Feng, Gang
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2022, 21 (11) : 9974 - 9989
  • [2] Adaptive Transmission Scheduling in Wireless Networks for Asynchronous Federated Learning
    Lee, Hyun-Suk
    Lee, Jang-Won
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2021, 39 (12) : 3673 - 3687
  • [3] Ensemble Federated Learning With Non-IID Data in Wireless Networks
    Zhao, Zhongyuan
    Wang, Jingyi
    Hong, Wei
    Quek, Tony Q. S.
    Ding, Zhiguo
    Peng, Mugen
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (04) : 3557 - 3571
  • [4] Quantization Bits Allocation for Wireless Federated Learning
    Lan, Muhang
    Ling, Qing
    Xiao, Song
    Zhang, Wenyi
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22 (11) : 8336 - 8351
  • [5] Rate distortion optimization for adaptive gradient quantization in federated learning
    Chen, Guojun
    Xie, Kaixuan
    Luo, Wenqiang
    Xu, Yinfei
    Xin, Lun
    Song, Tiecheng
    Hu, Jing
    DIGITAL COMMUNICATIONS AND NETWORKS, 2024, 10 (06) : 1813 - 1825
  • [6] Adaptive Quantization Mechanism for Federated Learning Models Based on DAG Blockchain
    Li, Tong
    Yang, Chao
    Wang, Lei
    Li, Tingting
    Zhao, Hai
    Chen, Jiewei
    ELECTRONICS, 2023, 12 (17)
  • [7] Adaptive Semi-Asynchronous Federated Learning Over Wireless Networks
    Chen, Zhixiong
    Yi, Wenqiang
    Shin, Hyundong
    Nallanathan, Arumugam
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2025, 73 (01) : 394 - 409
  • [8] Adaptive Model Pruning and Personalization for Federated Learning Over Wireless Networks
    Liu, Xiaonan
    Ratnarajah, Tharmalingam
    Sellathurai, Mathini
    Eldar, Yonina C.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2024, 72 : 4395 - 4411
  • [9] Adaptive Heterogeneous Client Sampling for Federated Learning Over Wireless Networks
    Luo, Bing
    Xiao, Wenli
    Wang, Shiqiang
    Huang, Jianwei
    Tassiulas, Leandros
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (10) : 9663 - 9677
  • [10] Wireless Federated Learning With Dynamic Quantization and Bandwidth Adaptation
    Feng, Wenjun
    Zhang, Xian
    IEEE WIRELESS COMMUNICATIONS LETTERS, 2022, 11 (11) : 2335 - 2339