Enhanced Scalable Graph Neural Network via Knowledge Distillation

Cited by: 0
Authors
Mai, Chengyuan [1 ,2 ]
Chang, Yaomin [1 ,2 ]
Chen, Chuan [1 ,2 ]
Zheng, Zibin [2 ,3 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[2] Sun Yat Sen Univ, Natl Engn Res Ctr Digital Life, Guangzhou 510006, Peoples R China
[3] Sun Yat Sen Univ, Sch Software Engn, Zhuhai 519000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Graph neural networks; Scalability; Computational modeling; Convolution; Training; Spectral analysis; Data mining; Graph neural network (GNN); knowledge distillation (KD); network embedding; scalability;
DOI
None available
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph neural networks (GNNs) have achieved state-of-the-art performance in various graph representation learning scenarios. However, when applied to real-world graph data, GNNs encounter scalability issues. Existing GNNs often incur a high computational load in both the training and inference stages, making them unable to meet the performance needs of large-scale scenarios with a large number of nodes. Although several studies on scalable GNNs have been conducted, they either improve GNNs with only limited scalability or do so at the expense of reduced effectiveness. Inspired by knowledge distillation's (KD's) success in preserving performance while improving scalability in computer vision and natural language processing, we propose an enhanced scalable GNN via KD (KD-SGNN) to improve both the scalability and the effectiveness of GNNs. On the one hand, KD-SGNN adopts the idea of decoupled GNNs, which separates feature transformation from feature propagation and leverages preprocessing techniques to improve scalability. On the other hand, KD-SGNN introduces two KD mechanisms (i.e., soft-target (ST) distillation and shallow imitation (SI) distillation) to improve expressiveness. The scalability and effectiveness of KD-SGNN are evaluated on multiple real-world datasets, and the effectiveness of the proposed KD mechanisms is further verified through comprehensive analyses.
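The two ideas the abstract names — precomputing feature propagation separately from feature transformation (the decoupled-GNN design), and soft-target distillation from a teacher's softened predictions — can be sketched roughly as follows. This is an illustrative NumPy sketch of the generic techniques, not the paper's actual implementation; function names, the propagation depth `k`, and the temperature value are assumptions.

```python
import numpy as np

def precompute_propagation(adj, features, k=2):
    """Decoupled-GNN preprocessing: propagate features k hops over the
    symmetrically normalized adjacency (with self-loops) once, before
    training, so the trained model only does feature transformation."""
    a_hat = adj + np.eye(adj.shape[0])                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    a_norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    out = features
    for _ in range(k):
        out = a_norm @ out                             # k rounds of smoothing
    return out

def soft_target_loss(student_logits, teacher_logits, temperature=2.0):
    """Generic soft-target distillation loss: KL divergence between the
    temperature-softened teacher and student class distributions."""
    def softmax(z):
        e = np.exp(z - z.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)
    p = softmax(teacher_logits / temperature)          # teacher soft targets
    q = softmax(student_logits / temperature)          # student predictions
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=1)))
```

Because the propagation step is a fixed precomputation, the per-epoch training cost no longer depends on the graph's edge count — which is the scalability benefit the decoupled design provides.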
Pages: 1258-1271
Page count: 14