Mix2FLD: Downlink Federated Learning After Uplink Federated Distillation With Two-Way Mixup

Cited: 34
Authors
Oh, Seungeun [1 ]
Park, Jihong [2 ]
Jeong, Eunjeong [1 ]
Kim, Hyesung [3 ]
Bennis, Mehdi [4 ]
Kim, Seong-Lyun [1 ]
Affiliations
[1] Yonsei Univ, Sch Elect & Elect Engn, Seoul 03722, South Korea
[2] Deakin Univ, Sch Informat Technol, Geelong, Vic 3220, Australia
[3] Samsung Elect, Samsung Res, Seoul 16677, South Korea
[4] Univ Oulu, Ctr Wireless Commun, Oulu 90500, Finland
Funding
National Research Foundation of Singapore; Academy of Finland
Keywords
Servers; Uplink; Downlink; Collaborative work; Data models; Data privacy; Wireless communication; Distributed machine learning; on-device learning; federated learning; federated distillation; uplink-downlink asymmetry;
DOI
10.1109/LCOMM.2020.3003693
CLC Classification
TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0809
Abstract
This letter proposes a novel communication-efficient and privacy-preserving distributed machine learning framework, coined Mix2FLD. To address uplink-downlink capacity asymmetry, local model outputs are uploaded to a server in the uplink as in federated distillation (FD), whereas global model parameters are downloaded in the downlink as in federated learning (FL). This requires a model output-to-parameter conversion at the server, after collecting additional data samples from devices. To preserve privacy while not compromising accuracy, linearly mixed-up local samples are uploaded, and inversely mixed up across different devices at the server. Numerical evaluations show that Mix2FLD achieves up to 16.7% higher test accuracy while reducing convergence time by up to 18.8% under asymmetric uplink-downlink channels compared to FL.
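The two-way Mixup described in the abstract can be sketched numerically: devices upload linearly mixed samples in the uplink, and the server inversely mixes pairs of uploads from different devices so that each synthetic seed sample remains a cross-device mixture rather than any single raw sample. The NumPy sketch below is a minimal illustration under the assumption that two devices mix the same label pair with complementary ratios lam and 1-lam; the function names and toy data are hypothetical and do not reproduce the paper's exact procedure.

```python
import numpy as np

def mixup(x1, x2, lam):
    """Device-side Mixup: upload a linear mixture instead of raw samples."""
    return lam * x1 + (1.0 - lam) * x2

def inverse_mixup(xt_a, xt_b, lam):
    """Server-side inverse-Mixup (sketch, not the paper's exact procedure).

    Assumes device A uploaded xt_a = lam*xA1 + (1-lam)*xA2 and device B
    uploaded xt_b = (1-lam)*xB1 + lam*xB2 for the same label pair, so a
    2x2 linear system yields one synthetic sample per class. Each output
    is still a mixture of samples from BOTH devices, so no raw sample is
    exposed to the server.
    """
    denom = 2.0 * lam - 1.0
    if abs(denom) < 1e-6:
        raise ValueError("lam = 0.5 makes the mixing matrix singular")
    x_hat1 = (lam * xt_a - (1.0 - lam) * xt_b) / denom  # label concentrated on class 1
    x_hat2 = (lam * xt_b - (1.0 - lam) * xt_a) / denom  # label concentrated on class 2
    return x_hat1, x_hat2

# Toy usage with hypothetical flattened 28x28 samples (two classes per device).
rng = np.random.default_rng(0)
xA1, xA2 = rng.random(784), rng.random(784)   # device A: class-1 and class-2 samples
xB1, xB2 = rng.random(784), rng.random(784)   # device B: class-1 and class-2 samples
lam = 0.8
xt_a = mixup(xA1, xA2, lam)            # uploaded by device A (ratio lam)
xt_b = mixup(xB1, xB2, 1.0 - lam)      # uploaded by device B (ratio 1-lam)
x_hat1, x_hat2 = inverse_mixup(xt_a, xt_b, lam)
```

With lam = 0.8, x_hat1 works out to roughly 1.07*xA1 + 0.27*xA2 - 0.07*xB1 - 0.27*xB2, i.e. a blend across both devices whose label is nonetheless a clean one-hot vector, which is what lets the server distill the collected model outputs into a global model for the FL-style downlink.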
Pages: 2211-2215
Number of pages: 5
Related Papers
5 records
  • [1] Mix2SFL: Two-Way Mixup for Scalable, Accurate, and Communication-Efficient Split Federated Learning
    Oh, Seungeun
    Nam, Hyelin
    Park, Jihong
    Vepakomma, Praneeth
    Raskar, Ramesh
    Bennis, Mehdi
    Kim, Seong-Lyun
    IEEE TRANSACTIONS ON BIG DATA, 2024, 10 (03) : 238 - 248
  • [2] Fed2KD: Heterogeneous Federated Learning for Pandemic Risk Assessment via Two-Way Knowledge Distillation
    Sun, Chuanneng
    Jiang, Tingcong
    Zonouz, Saman
    Pompili, Dario
    17TH CONFERENCE ON WIRELESS ON-DEMAND NETWORK SYSTEMS AND SERVICES (WONS 2022), 2021,
  • [3] Two-way Delayed Updates with Model Similarity in Communication-Efficient Federated Learning
    Mao, Yingchi
    Wang, Zibo
    Wu, Jun
    Shen, Lijuan
    Xu, Shufang
    Wu, Jie
    2023 19TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING, MSN 2023, 2023, : 520 - 527
  • [4] Reliable incentive mechanism in hierarchical federated learning based on two-way reputation and contract theory
    Cai, Hongyun
    Gao, Lijing
    Wang, Jiahao
    Li, Fengyu
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2024, 159 : 533 - 544
  • [5] Fed2PKD: Bridging Model Diversity in Federated Learning via Two-Pronged Knowledge Distillation
    Xie, Zaipeng
    Xu, Han
    Gao, Xing
    Jiang, Junchen
    Han, Ruiqian
    2024 IEEE 17TH INTERNATIONAL CONFERENCE ON CLOUD COMPUTING, CLOUD 2024, 2024, : 1 - 11