FedGraph-KD: An Effective Federated Graph Learning Scheme Based on Knowledge Distillation

Cited by: 3
Authors
Wang, Shiyu [1 ]
Xie, Jiahao [1 ]
Lu, Mingming [1 ]
Xiong, Neal N. [2 ]
Affiliations
[1] Cent South Univ, Dept Comp Sci & Engn, Changsha, Peoples R China
[2] Sul Ross State Univ, Dept Comp Math & Phys Sci, Alpine, TX USA
Source
2023 IEEE 9TH INTL CONFERENCE ON BIG DATA SECURITY ON CLOUD, BIGDATASECURITY, IEEE INTL CONFERENCE ON HIGH PERFORMANCE AND SMART COMPUTING, HPSC AND IEEE INTL CONFERENCE ON INTELLIGENT DATA AND SECURITY, IDS | 2023
Funding
National Natural Science Foundation of China;
Keywords
federated graph neural network; knowledge distillation; model heterogeneity; WIRELESS; BLOCKCHAIN;
DOI
10.1109/BigDataSecurity-HPSC-IDS58521.2023.00032
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Graph Neural Networks (GNNs) have achieved success in a variety of domains thanks to their powerful graph-data processing capabilities. However, gathering graph-structured data from multiple universities and applying GNNs for centralized training is highly challenging due to privacy concerns and regulatory restrictions. As a solution, Federated Graph Neural Networks (Fed-GNNs) support collaborative training of a public model by sharing parameters or features among multiple parties, without requiring any party to share its data; Fed-GNNs have therefore gained growing attention recently. However, existing Fed-GNN schemes do not consider that participants in the public model often hold different private GNN models, i.e., the model heterogeneity problem, and can therefore fail in heterogeneous scenarios. To address this issue, this paper proposes FedGraph-KD, an effective and novel federated graph learning scheme based on knowledge distillation. On the one hand, each client trains its local models through knowledge distillation; on the other hand, the scheme uses a federated learning framework to update the shared model parameters. Extensive experiments and analyses on several graph classification datasets demonstrate the effectiveness of our approach.
Pages: 130-134
Page count: 5
Related Papers
28 entries in total
  • [21] CGKDFL: A Federated Learning Approach Based on Client Clustering and Generator-Based Knowledge Distillation for Heterogeneous Data
    Zhang, Sanfeng
    Xu, Hongzhen
    Yu, Xiaojun
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2025, 37 (9-11)
  • [22] GFD-SSL: generative federated knowledge distillation-based semi-supervised learning
    Karami, Ali
    Ramezani, Reza
    Baraani Dastjerdi, Ahmad
INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (12): 5509 - 5529
  • [23] ABC-KD: Attention-Based-Compression Knowledge Distillation for Deep Learning-Based Noise Suppression
    Wan, Yixin
    Zhou, Yuan
    Peng, Xiulian
    Chang, Kai-Wei
    Lu, Yan
INTERSPEECH 2023, 2023: 2528 - 2532
  • [24] Mutual Knowledge-Distillation-Based Federated Learning for Short-Term Forecasting in Electric IoT Systems
    Tong, Cheng
    Zhang, Linghua
    Ding, Yin
    Yue, Dong
IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (19): 31190 - 31205
  • [25] Efficient Vehicle Selection and Resource Allocation for Knowledge Distillation-Based Federated Learning in UAV-Assisted VEC
    Li, Chunlin
    Zhang, Yong
    Yu, Long
    Yang, Mengjie
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2025,
  • [26] DCSR: A deep continual learning-based scheme for image super resolution using knowledge distillation
    Esmaeilzehi, Alireza
    Zaredar, Hossein
    Ahmad, M. Omair
    APPLIED INTELLIGENCE, 2025, 55 (07)
  • [27] KD-PAR: A knowledge distillation-based pedestrian attribute recognition model with multi-label mixed feature learning network
    Wu, Peishu
    Wang, Zidong
    Li, Han
    Zeng, Nianyin
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 237
  • [28] FedKD-IDS: A robust intrusion detection system using knowledge distillation-based semi-supervised federated learning and anti-poisoning attack mechanism
    Quyen, Nguyen Huu
    Duy, Phan The
    Nguyen, Ngo Thao
    Khoa, Nghi Hoang
    Pham, Van-Hau
    INFORMATION FUSION, 2025, 117