SparseBatch: Communication-efficient Federated Learning With Partially Homomorphic Encryption

Cited by: 0
Authors
Wang, Chong [1 ]
Wang, Jing [2 ]
Lou, Zheng [1 ]
Kong, Linghai [2 ]
Tao, Weisong [1 ]
Wang, Yun [2 ]
Affiliations
[1] State Grid Jiangsu Elect Power Co Ltd, Nanjing 210024, Peoples R China
[2] Southeast Univ, Sch Comp Sci & Engn, Nanjing 211189, Peoples R China
Source
JOURNAL OF APPLIED SCIENCE AND ENGINEERING | 2025, Vol. 28, Issue 8
Keywords
Homomorphic encryption; Federated Learning; Gradient sparsification; Gradient quantization; Lion optimizer;
D O I
10.6180/jase.202508_28(8).0003
Chinese Library Classification (CLC)
T [Industrial Technology];
Discipline Code
08;
Abstract
Cross-silo federated learning (FL) enables collaborative model training among organizations (e.g., financial or medical institutions). It aggregates the local gradient updates contributed by participating clients while safeguarding the privacy of sensitive data. Industrial FL frameworks employ additively homomorphic encryption (HE) to mask local gradient updates during aggregation, guaranteeing that no individual update is revealed. However, this protection incurs significant computational and communication overhead: encryption and decryption operations occupy the majority of the training time, and the bit length of a ciphertext is two orders of magnitude larger than that of the corresponding plaintext, inflating the amount of data transferred. In this paper, we present SparseBatch, a new gradient sparsification method. By designing a new general gradient correction method and adopting the Lion optimizer's gradient quantization scheme, SparseBatch combines gradient sparsification with quantization. Experimental results show that, compared with BatchCrypt, SparseBatch reduces computation and communication overhead by 5×, while the accuracy reduction is less
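The combination described in the abstract — keeping only the largest-magnitude gradient entries, then quantizing them to signs in the style of the Lion optimizer — can be illustrated with a minimal sketch. The function names, the top-k selection, and the single shared scale below are illustrative assumptions, not the paper's actual algorithm (which additionally involves a gradient correction step and HE batching):

```python
import numpy as np

def sparsebatch_compress(grad, k_ratio=0.1):
    """Hypothetical sketch: top-k sparsification followed by
    Lion-style sign quantization. Returns the surviving indices,
    their signs (1 bit of information per value), one shared
    scale factor, and the original tensor shape."""
    flat = grad.ravel()
    k = max(1, int(k_ratio * flat.size))
    # Indices of the k largest-magnitude entries (unsorted).
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    # Sign quantization: each kept value collapses to +1 / -1.
    signs = np.sign(flat[idx]).astype(np.int8)
    # Single scale recovered at decompression time.
    scale = float(np.abs(flat[idx]).mean())
    return idx, signs, scale, grad.shape

def sparsebatch_decompress(idx, signs, scale, shape):
    """Rebuild a dense gradient from the compressed triple."""
    flat = np.zeros(int(np.prod(shape)), dtype=np.float32)
    flat[idx] = signs * scale
    return flat.reshape(shape)
```

Under an additively homomorphic scheme, only the small `signs`/`scale` payload (plus indices) would need to be encrypted and summed, which is the source of the communication savings the abstract claims.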
Pages: 1645-1656
Page count: 12
References
31 items in total
  • [1] Alistarh D, 2017, ADV NEUR IN, V30
  • [2] Bukaty P., 2019, The California consumer privacy act (CCPA): An implementation guide
  • [3] Chen CY, 2018, AAAI CONF ARTIF INTE, P2827
  • [4] Chen X, 2024, P 37 INT C NEUR INF, V37, P49205
  • [5] Creemers R., 2023, China Law Soc. Rev, V6, P111, DOI 10.1163/25427466-06020001
  • [6] Dryden N, 2016, PROCEEDINGS OF 2016 2ND WORKSHOP ON MACHINE LEARNING IN HPC ENVIRONMENTS (MLHPC), P1, DOI 10.1109/MLHPC.2016.004
  • [7] Duchi J, 2011, J MACH LEARN RES, V12, P2121
  • [8] Aji AF, 2017, Arxiv, DOI arXiv:1704.05021
  • [9] Han S., 2018, arXiv
  • [10] Horvath S, Kovalev D, Mishchenko K, Richtarik P, Stich S, Stochastic distributed learning with gradient quantization and double-variance reduction, OPTIMIZATION METHODS & SOFTWARE, 2023, V38, P91-106