A federated deep learning framework for privacy preservation and communication efficiency

Cited by: 24
Authors
Cao, Tien-Dung [1 ]
Tram, Truong-Huu [2 ]
Tran, Hien [1 ]
Tran, Khanh [3 ,4 ]
Affiliations
[1] Tan Tao Univ, Sch Engn, Tan Duc E City, Long An Province, Vietnam
[2] Singapore Inst Technol, Singapore, Singapore
[3] Vietnam Natl Univ Ho Chi Minh City, Int Univ, Dept Math, Ho Chi Minh City, Vietnam
[4] Tan Tao Univ, Tan Duc E City, Long An, Vietnam
Keywords
Federated learning; Deep learning; Parallel training; Communication efficiency; Privacy preserving
DOI
10.1016/j.sysarc.2022.102413
Chinese Library Classification (CLC)
TP3 [Computing technology; computer technology]
Discipline classification code
0812
Abstract
Deep learning has achieved great success in many applications. However, its practical deployment is hindered by two issues: the privacy of data that must be aggregated centrally for model training, and the high communication overhead of transmitting large amounts of usually geographically distributed data. Addressing both issues is challenging, and most existing works do not provide an efficient solution. In this paper, we develop FedPC, a Federated deep learning framework for Privacy preservation and Communication efficiency. The framework allows a model to be learned on multiple private datasets without revealing any information about the training data, even through intermediate data, and it minimizes the amount of data exchanged to update the model. We formally prove the convergence of the model when trained with FedPC, as well as its privacy-preserving property. We perform extensive experiments to evaluate FedPC in terms of its approximation to the upper-bound performance (obtained with centralized training) and its communication overhead. The results show that FedPC keeps model performance within 8.5% of the centrally trained models when data is distributed to 10 computing nodes, and reduces communication overhead by up to 42.20% compared to existing works.
Pages: 13
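The abstract describes the general federated pattern FedPC builds on: clients train locally on private data and exchange only model updates, never raw samples. The abstract does not give FedPC's actual protocol, so the following is a minimal, illustrative FedAvg-style sketch (plain weighted averaging of client weights, not the authors' method); all names and the toy 1-D linear model are invented for illustration.

```python
# Illustrative federated-averaging sketch -- NOT the FedPC protocol, whose
# details are not given in the abstract. Each client fits a 1-D linear model
# y = w * x on its private data; only the scalar weight (never the data)
# is sent to the server, which averages weighted by dataset size.

def local_train(w, data, lr=0.01, epochs=50):
    """One client's update: plain gradient descent on squared error."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_round(w_global, client_datasets):
    """One server round: clients train locally, server averages the weights."""
    updates, sizes = [], []
    for data in client_datasets:
        updates.append(local_train(w_global, data))
        sizes.append(len(data))
    total = sum(sizes)
    return sum(w * n / total for w, n in zip(updates, sizes))

# Two clients whose private data both follow y = 3x; the datasets are
# never pooled centrally, yet the global model recovers the shared slope.
clients = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (4.0, 12.0)]]
w = 0.0
for _ in range(20):
    w = federated_round(w, clients)
print(round(w, 2))  # converges toward 3.0
```

Only one float per client crosses the network each round here; FedPC additionally compresses and protects such exchanges, which is where its reported 42.20% communication savings come from.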