Decentralized Federated Graph Learning via Surrogate Model

Cited by: 0
Authors
Zhang, Bolin [1 ]
Gu, Ruichun [1 ]
Liu, Haiying [1 ]
Affiliations
[1] Inner Mongolia Univ Sci & Technol, Sch Digital & Intelligent Ind, Sch Cyber Sci & Technol, Baotou 014010, Peoples R China
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2025, Vol. 82, No. 02
Keywords
Federated learning; federated graph learning; decentralized; graph neural network; privacy preservation;
DOI
10.32604/cmc.2024.060331
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Federated Graph Learning (FGL) enables model training without requiring each client to share local graph data, effectively breaking data silos by aggregating the training parameters from each terminal while safeguarding data privacy. Traditional FGL relies on a centralized server for model aggregation; however, this central server presents challenges such as a single point of failure and high communication overhead. Additionally, efficiently training a robust personalized local model for each client remains a significant objective in federated graph learning. To address these issues, we propose a decentralized Federated Graph Learning framework with efficient communication, termed Decentralized Federated Graph Learning via Surrogate Model (SD_FGL). In SD_FGL, each client is required to maintain two models: a private model and a surrogate model. The surrogate model is publicly shared and can exchange and update information directly with any client, eliminating the need for a central server and reducing communication overhead. The private model is independently trained by each client, allowing it to calculate similarity with other clients based on local data as well as information shared through the surrogate model. This enables the private model to better adjust its training strategy and selectively update its parameters. Additionally, local differential privacy is incorporated into the surrogate model training process to enhance privacy protection. Testing on three real-world graph datasets demonstrates that the proposed framework improves accuracy while achieving decentralized Federated Graph Learning with lower communication overhead and stronger privacy safeguards.
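The abstract outlines the core mechanism: each client keeps a private model and a shared surrogate model, exchanges surrogate parameters peer-to-peer (no central server), weights peers by similarity to its own local model, and adds local differential privacy noise before sharing. The sketch below illustrates that flow in simplified form; the `Client` class, Laplace noise for local DP, cosine similarity as the similarity measure, and the mixing coefficients are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

class Client:
    """One participant in a decentralized FGL-style scheme: it keeps a
    private model (never shared) and a surrogate model (shared with peers)."""

    def __init__(self, dim):
        self.private = rng.normal(size=dim)   # private model parameters
        self.surrogate = self.private.copy()  # publicly exchangeable copy

    def share_surrogate(self, epsilon=1.0, sensitivity=1.0):
        # Local differential privacy (illustrative): perturb the surrogate
        # parameters with Laplace noise before sending them to any peer.
        noise = rng.laplace(scale=sensitivity / epsilon,
                            size=self.surrogate.shape)
        return self.surrogate + noise

    def update_from_peers(self, peer_params):
        # Weight each peer's shared surrogate by its cosine similarity to
        # this client's private model, so the private model selectively
        # absorbs information from similar clients (hypothetical rule).
        sims = []
        for p in peer_params:
            s = np.dot(self.private, p) / (
                np.linalg.norm(self.private) * np.linalg.norm(p) + 1e-12)
            sims.append(max(s, 0.0))  # ignore dissimilar peers
        sims = np.array(sims)
        if sims.sum() > 0:
            mix = (sims[:, None] * np.array(peer_params)).sum(0) / sims.sum()
            self.private = 0.8 * self.private + 0.2 * mix  # partial update
        self.surrogate = self.private.copy()  # refresh the shared copy

# One fully peer-to-peer round among three clients, with no central server:
clients = [Client(dim=4) for _ in range(3)]
for i, c in enumerate(clients):
    peers = [p.share_surrogate() for j, p in enumerate(clients) if j != i]
    c.update_from_peers(peers)
```

Because only the noisy surrogate parameters ever leave a client, raw graph data and the private model stay local, which is the privacy property the framework targets.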
Pages: 2521-2535
Page count: 15