Communication-Efficient and Privacy-Aware Distributed Learning

Cited by: 2
Authors
Gogineni, Vinay Chakravarthi [1 ,2 ]
Moradi, Ashkan [3 ]
Venkategowda, Naveen K. D. [4 ]
Werner, Stefan [3 ,5 ]
Affiliations
[1] Norwegian Univ Sci & Technol, N-7491 Trondheim, Norway
[2] Univ Southern Denmark, Maersk Mc Kinney Moller Inst, SDU Appl AI & Data Sci, DK-5230 Odense, Denmark
[3] Norwegian Univ Sci & Technol, Dept Elect Syst, N-7491 Trondheim, Norway
[4] Linkoping Univ, S-60174 Norrkoping, Sweden
[5] Aalto Univ, Dept Informat & Commun Engn, Espoo 00076, Finland
Source
IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS | 2023, Vol. 9
Keywords
Privacy; Distance learning; Computer aided instruction; Heuristic algorithms; Information processing; Differential privacy; Convergence; Average consensus; communication efficiency; distributed learning; multiagent systems; privacy-preservation; NETWORKS; SECURE;
DOI
10.1109/TSIPN.2023.3322783
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics & Communication Technology]
Discipline Classification Codes
0808; 0809
Abstract
Communication efficiency and privacy are two key concerns in modern distributed computing systems. To address both, this article proposes partial-sharing private distributed learning (PPDL) algorithms that offer communication efficiency while preserving privacy, making them suitable for applications with limited resources in adversarial environments. First, we propose a noise-injection-based PPDL algorithm that achieves communication efficiency by sharing only a fraction of the information at each consensus iteration and provides privacy by perturbing the information exchanged among neighbors. To further increase privacy, local information is randomly decomposed into private and public substates before sharing with the neighbors. This results in a decomposition- and noise-injection-based PPDL strategy in which only a fraction of the perturbed public substate is shared during local collaborations, whereas the private substate is updated locally without being shared. To determine the impact of communication savings and privacy preservation on the performance of distributed learning algorithms, we analyze the mean and mean-square convergence of the proposed algorithms. Moreover, we investigate the privacy of agents by characterizing privacy as the mean squared error of the estimate of private information at the honest-but-curious adversary. The analytical results show a tradeoff between communication efficiency and privacy in the proposed PPDL algorithms, while decomposition- and noise-injection-based PPDL improves privacy compared to noise-injection-based PPDL. Lastly, numerical simulations corroborate the analytical findings.
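The core mechanism the abstract describes can be illustrated with a minimal sketch: at each consensus iteration, every agent perturbs a random fraction of its state coordinates with noise and shares only those entries with its neighbors, which then average the received entries coordinate-wise. This is an illustrative simplification, not the paper's exact update rule; the function name, the uniform coordinate selection, the fixed 1/2 mixing weight, and the Gaussian noise model are all assumptions for demonstration (the decomposition-based variant would additionally split each state into a private substate, updated locally, and a public substate, treated as below).

```python
import numpy as np

def ppdl_consensus_step(states, adjacency, share_frac=0.25, noise_std=0.1, rng=None):
    """One partial-sharing, noise-injected consensus iteration (illustrative).

    states     : (n_agents, dim) array of local parameter vectors
    adjacency  : (n_agents, n_agents) symmetric 0/1 matrix, zero diagonal
    share_frac : fraction of coordinates each agent shares this round
    noise_std  : std. dev. of the Gaussian perturbation added before sharing
    """
    rng = np.random.default_rng() if rng is None else rng
    n, dim = states.shape
    k = max(1, int(share_frac * dim))
    new_states = states.copy()

    # Each agent perturbs and broadcasts only a random subset of coordinates;
    # unshared coordinates are marked NaN (i.e., never leave the agent).
    shared = np.full((n, dim), np.nan)
    for i in range(n):
        idx = rng.choice(dim, size=k, replace=False)
        shared[i, idx] = states[i, idx] + rng.normal(0.0, noise_std, size=k)

    # Each agent mixes received entries with its own, coordinate-wise.
    for i in range(n):
        for j in np.nonzero(adjacency[i])[0]:
            mask = ~np.isnan(shared[j])
            new_states[i, mask] = 0.5 * new_states[i, mask] + 0.5 * shared[j, mask]
    return new_states
```

Lowering `share_frac` reduces per-iteration communication, while raising `noise_std` makes the adversary's estimate of the true state noisier; the tradeoff between these two knobs is what the paper's analysis quantifies.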
Pages: 705-720
Page count: 16
Related Papers
50 records
  • [21] Communication-Efficient Distributed Learning Over Networks-Part II: Necessary Conditions for Accuracy
    Liu, Zhenyu
    Conti, Andrea
    Mitter, Sanjoy K.
    Win, Moe Z.
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2023, 41 (04) : 1102 - 1119
  • [22] Communication-Efficient Coded Distributed Multi-Task Learning
    Tang, Hua
    Hu, Haoyang
    Yuan, Kai
    Wu, Youlong
2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021
  • [23] Communication-efficient Federated Learning with Privacy Enhancing via Probabilistic Scheduling
    Zhou, Ziao
    Huang, Shaoming
    Wu, Youlong
    Wen, Dingzhu
    Wang, Ting
    Cai, Haibin
    Shi, Yuanming
2024 IEEE/CIC INTERNATIONAL CONFERENCE ON COMMUNICATIONS IN CHINA, ICCC, 2024
  • [24] Communication-Efficient and Privacy-Preserving Aggregation in Federated Learning With Adaptability
    Sun, Xuehua
    Yuan, Zengsen
    Kong, Xianguang
    Xue, Liang
    He, Lang
    Lin, Ying
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (15): : 26430 - 26443
  • [25] Communication-Efficient Distributed Learning for High-Dimensional Support Vector Machines
    Zhou, Xingcai
    Shen, Hao
    MATHEMATICS, 2022, 10 (07)
  • [26] Layer-Based Communication-Efficient Federated Learning with Privacy Preservation
    Lian, Zhuotao
    Wang, Weizheng
    Huang, Huakun
    Su, Chunhua
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2022, E105D (02) : 256 - 263
  • [27] Intermittent Pulling With Local Compensation for Communication-Efficient Distributed Learning
    Wang, Haozhao
    Qu, Zhihao
    Guo, Song
    Gao, Xin
    Li, Ruixuan
    Ye, Baoliu
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTING, 2022, 10 (02) : 779 - 791
  • [28] AC-SGD: Adaptively Compressed SGD for Communication-Efficient Distributed Learning
    Yan, Guangfeng
    Li, Tan
    Huang, Shao-Lun
    Lan, Tian
    Song, Linqi
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2022, 40 (09) : 2678 - 2693
  • [29] An Efficient Privacy-Aware Authentication Scheme for Distributed Mobile Cloud Computing Services without Bilinear Pairings
    Xiong, Ling
    Peng, Tu
    Peng, Dai-Yuan
    Liang, Hong-Bin
    Liu, Zhi-Cai
    JOURNAL OF INFORMATION SCIENCE AND ENGINEERING, 2019, 35 (02) : 341 - 360
  • [30] Learning-Based Efficient Sparse Sensing and Recovery for Privacy-Aware IoMT
    Wei, Tiankuo
    Liu, Sicong
    Du, Xiaojiang
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (12) : 9948 - 9959