Enhanced Scalable Graph Neural Network via Knowledge Distillation

Cited: 0
Authors
Mai, Chengyuan [1 ,2 ]
Chang, Yaomin [1 ,2 ]
Chen, Chuan [1 ,2 ]
Zheng, Zibin [2 ,3 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[2] Sun Yat Sen Univ, Natl Engn Res Ctr Digital Life, Guangzhou 510006, Peoples R China
[3] Sun Yat Sen Univ, Sch Software Engn, Zhuhai 519000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Graph neural networks; Scalability; Computational modeling; Convolution; Training; Spectral analysis; Data mining; Graph neural network (GNN); knowledge distillation (KD); network embedding; scalability;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph neural networks (GNNs) have achieved state-of-the-art performance in various graph representation learning scenarios. However, when applied to real-world graph data, GNNs encounter scalability issues. Existing GNNs often carry a high computational load in both the training and inference stages, making them unable to meet the performance needs of large-scale scenarios with many nodes. Although several scalable GNNs have been developed, they either improve scalability only marginally or do so at the expense of effectiveness. Inspired by the success of knowledge distillation (KD) in preserving performance while improving scalability in computer vision and natural language processing, we propose an enhanced scalable GNN via KD (KD-SGNN) to improve both the scalability and the effectiveness of GNNs. On the one hand, KD-SGNN adopts the idea of decoupled GNNs: it decouples feature transformation from feature propagation and leverages preprocessing techniques to improve scalability. On the other hand, KD-SGNN introduces two KD mechanisms, soft-target (ST) distillation and shallow imitation (SI) distillation, to improve expressiveness. The scalability and effectiveness of KD-SGNN are evaluated on multiple real-world datasets, and the effectiveness of the proposed KD mechanisms is further verified through comprehensive analyses.
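The decoupled design and ST distillation described in the abstract can be illustrated in a few lines of PyTorch. The sketch below is an illustrative reconstruction, not the authors' implementation: the names (precompute_propagation, FeatureMLP, st_distillation_loss) and the hyperparameters (k, T, alpha) are assumptions, and the SI distillation mechanism is omitted because the abstract does not specify its form. The key point it shows is that parameter-free feature propagation is done once as preprocessing (as in SGC-style decoupled GNNs), so training reduces to a node-wise MLP supervised by both ground-truth labels and a teacher's softened logits.

```python
# Minimal sketch of the decoupled-GNN + soft-target (ST) distillation idea.
# All names and hyperparameters here are illustrative assumptions,
# not the authors' actual API.

import torch
import torch.nn as nn
import torch.nn.functional as F


def precompute_propagation(adj_norm: torch.Tensor, x: torch.Tensor, k: int = 2) -> torch.Tensor:
    """Apply k hops of normalized-adjacency propagation as a one-time
    preprocessing step, so training only touches node-wise features."""
    for _ in range(k):
        x = adj_norm @ x  # one matmul per hop; done once, offline
    return x


class FeatureMLP(nn.Module):
    """Student model: a plain MLP over the pre-propagated features."""

    def __init__(self, in_dim: int, hid_dim: int, n_classes: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hid_dim),
            nn.ReLU(),
            nn.Linear(hid_dim, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def st_distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft-target distillation: KL divergence between temperature-softened
    distributions, blended with cross-entropy on the ground-truth labels."""
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```

In use, precompute_propagation would run once over the whole graph, after which FeatureMLP trains on mini-batches of rows of the propagated feature matrix. This is what gives the decoupled design its scalability: no neighborhood sampling or message passing is needed during training or inference.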
Pages: 1258 - 1271
Number of pages: 14
Related Papers
50 records in total
  • [21] Graph Convolutional Neural Network for Intelligent Fault Diagnosis of Machines via Knowledge Graph
    Mao, Zehui
    Wang, Huan
    Jiang, Bin
    Xu, Juan
    Guo, Huifeng
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2024, 20 (05) : 7862 - 7870
  • [22] Multiresolution Reservoir Graph Neural Network
    Pasa, Luca
    Navarin, Nicolo
    Sperduti, Alessandro
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (06) : 2642 - 2653
  • [23] SGKD: A Scalable and Effective Knowledge Distillation Framework for Graph Representation Learning
    He, Yufei
    Ma, Yao
    2022 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, ICDMW, 2022, : 666 - 673
  • [24] Parallel Blockwise Knowledge Distillation for Deep Neural Network Compression
    Blakeney, Cody
    Li, Xiaomin
    Yan, Yan
    Zong, Ziliang
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2021, 32 (07) : 1765 - 1776
  • [25] Knowledge Distillation with Graph Neural Networks for Epileptic Seizure Detection
    Zheng, Qinyue
    Venkitaraman, Arun
    Petravic, Simona
    Frossard, Pascal
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: APPLIED DATA SCIENCE AND DEMO TRACK, ECML PKDD 2023, PT VI, 2023, 14174 : 547 - 563
  • [26] GRAND+: Scalable Graph Random Neural Networks
    Feng, Wenzheng
    Dong, Yuxiao
    Huang, Tinglin
    Yin, Ziqi
    Cheng, Xu
    Kharlamov, Evgeny
    Tang, Jie
    PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 3248 - 3258
  • [27] When Pansharpening Meets Graph Convolution Network and Knowledge Distillation
    Yan, Keyu
    Zhou, Man
    Liu, Liu
    Xie, Chengjun
    Hong, Danfeng
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
  • [28] Revisiting Graph based Social Recommendation: A Distillation Enhanced Social Graph Network
    Tao, Ye
    Li, Ying
    Zhang, Su
    Hou, Zhirong
    Wu, Zhonghai
    PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 2830 - 2838
  • [29] Narrow the Input Mismatch in Deep Graph Neural Network Distillation
    Zhou, Qiqi
    Shen, Yanyan
    Chen, Lei
    PROCEEDINGS OF THE 29TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2023, 2023, : 3581 - 3592
  • [30] GSGNet-S*: Graph Semantic Guidance Network via Knowledge Distillation for Optical Remote Sensing Image Scene Analysis
    Zhou, Wujie
    Li, Yangzhen
    Huang, Juan
    Yan, Weiqing
    Fang, Meixin
    Jiang, Qiuping
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61 : 1 - 12