FedGraph-KD: An Effective Federated Graph Learning Scheme Based on Knowledge Distillation

Cited by: 3
Authors
Wang, Shiyu [1 ]
Xie, Jiahao [1 ]
Lu, Mingming [1 ]
Xiong, Neal N. [2 ]
Affiliations
[1] Cent South Univ, Dept Comp Sci & Engn, Changsha, Peoples R China
[2] Sul Ross State Univ, Dept Comp Math & Phys Sci, Alpine, TX USA
Source
2023 IEEE 9TH INTL CONFERENCE ON BIG DATA SECURITY ON CLOUD (BIGDATASECURITY), IEEE INTL CONFERENCE ON HIGH PERFORMANCE AND SMART COMPUTING (HPSC) AND IEEE INTL CONFERENCE ON INTELLIGENT DATA AND SECURITY (IDS), 2023
Funding
National Natural Science Foundation of China
关键词
federated graph neural network; knowledge distillation; model heterogeneity; WIRELESS; BLOCKCHAIN;
DOI
10.1109/BigDataSecurity-HPSC-IDS58521.2023.00032
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Graph Neural Networks (GNNs) have achieved success in a variety of domains due to their powerful graph-data processing capabilities. However, gathering graph-structured data from multiple universities and applying GNNs for centralized training is highly challenging due to privacy concerns and regulatory limitations. As a solution, Federated Graph Neural Networks (Fed-GNNs) do not require sharing data but support collaborative training of a public model by sharing parameters or features among multiple parties. Thus, Fed-GNNs have gained increasing attention recently. However, existing Fed-GNN schemes do not consider that participants in the public model often hold different private GNN models, i.e., the model heterogeneity problem, which can cause these schemes to fail in heterogeneous settings. To address this issue, this paper proposes an effective and novel Federated Graph Learning scheme based on Knowledge Distillation (FedGraph-KD). On one hand, each client trains its local models through knowledge distillation; on the other, a federated learning framework updates the shared model parameters. Extensive experiments and analyses on several graph classification datasets demonstrate the effectiveness of our approach.
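The two-sided update sketched in the abstract — local knowledge distillation on each client, federated averaging of only the shared model on the server — can be illustrated with a toy round. Everything below is an illustrative assumption rather than the paper's actual method: the shared model is reduced to a linear map, each client's private (possibly heterogeneous) GNN is abstracted as fixed teacher logits on one local sample, and the function names `distill_step` and `fed_avg` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def distill_step(W, x, teacher_logits, lr=0.5):
    """One knowledge-distillation step on a linear shared model:
    nudge its logits W @ x toward the private teacher's logits
    via the gradient of 0.5 * ||W @ x - teacher_logits||^2."""
    grad = np.outer(W @ x - teacher_logits, x)   # dLoss/dW
    return W - lr * grad

def fed_avg(param_list):
    """Server step: average the clients' shared-model parameters
    (FedAvg); private teachers never leave the clients."""
    return np.mean(param_list, axis=0)

dim, n_classes, n_clients = 4, 3, 3
W_shared = np.zeros((n_classes, dim))            # initial shared model

# Heterogeneous private models only have to agree at the output level,
# so each teacher is represented purely by its logits on a local sample.
client_data, client_params = [], []
for _ in range(n_clients):
    x = rng.normal(size=dim)
    x /= np.linalg.norm(x)                       # unit norm keeps the step stable
    teacher_logits = rng.normal(size=n_classes)  # private teacher's output
    W = W_shared.copy()
    for _ in range(50):                          # local distillation epochs
        W = distill_step(W, x, teacher_logits)
    client_data.append((x, teacher_logits))
    client_params.append(W)

W_shared = fed_avg(client_params)                # one federated round
```

After local distillation, each client's copy of the shared model reproduces its teacher's logits on the local sample; the server then averages those copies, which is the only information exchanged in the round.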
Pages: 130-134
Page count: 5