Enhanced Scalable Graph Neural Network via Knowledge Distillation

Cited: 0
Authors
Mai, Chengyuan [1,2]
Chang, Yaomin [1,2]
Chen, Chuan [1,2]
Zheng, Zibin [2,3]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[2] Sun Yat Sen Univ, Natl Engn Res Ctr Digital Life, Guangzhou 510006, Peoples R China
[3] Sun Yat Sen Univ, Sch Software Engn, Zhuhai 519000, Peoples R China
Funding
National Natural Science Foundation of China
关键词
Graph neural networks; Scalability; Computational modeling; Convolution; Training; Spectral analysis; Data mining; Graph neural network (GNN); knowledge distillation (KD); network embedding
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Graph neural networks (GNNs) have achieved state-of-the-art performance in various graph representation learning scenarios. However, when applied to real-world graph data, GNNs encounter scalability issues. Existing GNNs often carry a high computational load in both the training and inference stages, making them unable to meet the performance needs of large-scale scenarios with many nodes. Although several studies on scalable GNNs have been conducted, they either improve GNNs with only limited scalability or do so at the expense of effectiveness. Inspired by knowledge distillation's (KD's) success in preserving performance while improving scalability in computer vision and natural language processing, we propose an enhanced scalable GNN via KD (KD-SGNN) to improve both the scalability and the effectiveness of GNNs. On the one hand, KD-SGNN adopts the idea of decoupled GNNs, which separates feature transformation from feature propagation and leverages preprocessing techniques to improve scalability. On the other hand, KD-SGNN introduces two KD mechanisms, soft-target (ST) distillation and shallow imitation (SI) distillation, to improve expressiveness. The scalability and effectiveness of KD-SGNN are evaluated on multiple real-world datasets, and the effectiveness of the proposed KD mechanisms is further verified through comprehensive analyses.
Pages: 1258-1271
Page count: 14
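For context, the decoupling described in the abstract follows the SGC/SIGN line of scalable GNNs: feature propagation over the normalized adjacency is precomputed once as preprocessing, so training reduces to fitting a plain MLP on the smoothed features, with no neighborhood sampling in the training loop. The PyTorch sketch below illustrates that general idea only; `precompute_propagation`, `DecoupledGNN`, and all hyperparameters are assumed names, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

def precompute_propagation(adj_norm: torch.Tensor, x: torch.Tensor, k: int) -> torch.Tensor:
    """Run k steps of feature propagation x <- A_hat @ x once, offline.

    adj_norm is the (sparse or dense) symmetrically normalized adjacency
    D^{-1/2} (A + I) D^{-1/2}. Doing this as preprocessing removes the
    expensive propagation from every training iteration.
    """
    for _ in range(k):
        x = adj_norm @ x
    return x

class DecoupledGNN(nn.Module):
    """Feature transformation only: a plain MLP over pre-propagated features,
    so mini-batch training scales like ordinary supervised learning."""

    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, x_propagated: torch.Tensor) -> torch.Tensor:
        return self.mlp(x_propagated)
```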
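Likewise, the soft-target (ST) mechanism presumably builds on the standard Hinton-style distillation objective, in which a stronger teacher GNN supervises the scalable student through temperature-softened class distributions. The sketch below shows that standard objective under assumed hyperparameters `T` (temperature) and `alpha` (mixing weight); the paper's exact ST loss, and its shallow imitation (SI) counterpart, may differ.

```python
import torch
import torch.nn.functional as F

def soft_target_loss(student_logits: torch.Tensor,
                     teacher_logits: torch.Tensor,
                     labels: torch.Tensor,
                     T: float = 2.0,
                     alpha: float = 0.5) -> torch.Tensor:
    """Hard-label cross-entropy mixed with KL divergence between
    temperature-softened teacher and student predictions."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits.detach() / T, dim=-1),  # teacher is frozen
        reduction="batchmean",
    ) * (T * T)  # T^2 rescaling keeps gradient magnitudes comparable
    return alpha * ce + (1.0 - alpha) * kd
```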