Enhancing Graph Neural Networks via Memorized Global Information

Cited: 0
Authors
Zeng, Ruihong [1 ]
Fang, Jinyuan [2 ]
Liu, Siwei [3 ]
Meng, Zaiqiao [4 ]
Liang, Shangsong [5 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou, Guangdong, Peoples R China
[2] Sun Yat Sen Univ, Guangzhou, Peoples R China
[3] Mohamed bin Zayed Univ Artificial Intelligence Mas, Dept Machine Learning, Abu Dhabi, U Arab Emirates
[4] Univ Glasgow, Glasgow City, Scotland
[5] Sun Yat Sen Univ, Sch Data & Comp Sci, Guangzhou, Peoples R China
Keywords
Network embedding; graph neural network; memorized global information
DOI
10.1145/3689430
CLC classification
TP [Automation Technology, Computer Technology]
Subject classification code
0812
Abstract
Graph neural networks (GNNs) have gained significant attention for their impressive results on a range of graph-based tasks. The essential mechanism of GNNs is the message-passing framework, in which node representations are aggregated from local neighborhoods. Recently, Transformer-based GNNs have been introduced to learn long-range dependencies, enhancing performance. However, their quadratic computational complexity, due to the attention computation, has constrained their applicability to large-scale graphs. To address this issue, we propose MGIGNN (Memorized Global Information Graph Neural Network), an innovative approach that leverages memorized global information to enhance existing GNNs in both transductive and inductive scenarios. Specifically, MGIGNN captures long-range dependencies by identifying and incorporating global similar nodes, defined as nodes exhibiting similar features, structural patterns, and label information within a graph. To alleviate the computational overhead of computing embeddings for all nodes, we introduce an external memory module that facilitates the retrieval of embeddings and optimizes performance on large graphs. To enhance memory efficiency, MGIGNN selectively retrieves global similar nodes from a small set of candidate nodes, which are selected from the training nodes based on a sparse node selection distribution with a Dirichlet prior. This selection approach not only reduces the required memory size but also ensures efficient use of computational resources. Through comprehensive experiments on ten widely used real-world datasets, including seven homogeneous and three heterogeneous datasets, we demonstrate that MGIGNN can generally improve the performance of existing GNNs on node classification tasks under both inductive and transductive settings.
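The core retrieval idea described in the abstract can be illustrated with a minimal sketch: keep a memory bank of candidate-node embeddings, retrieve each node's most similar entries, and mix them into the node's local representation. This is not the authors' implementation; the function names, the cosine-similarity measure, and the mixing weight `alpha` are illustrative assumptions (the paper's sparse Dirichlet-prior candidate selection is also omitted here).

```python
import numpy as np

def topk_similar(query, memory, k=3):
    """Return indices of the k memory rows most cosine-similar to `query`."""
    q = query / np.linalg.norm(query)
    m = memory / np.linalg.norm(memory, axis=1, keepdims=True)
    sims = m @ q                      # cosine similarity to every memory entry
    return np.argsort(-sims)[:k]      # indices of the k highest scores

def enhance_with_memory(node_emb, memory, k=3, alpha=0.5):
    """Mix each node's embedding with the mean of its k most similar
    memory entries (a stand-in for the 'global similar nodes' signal)."""
    out = np.empty_like(node_emb)
    for i, h in enumerate(node_emb):
        idx = topk_similar(h, memory, k)
        global_ctx = memory[idx].mean(axis=0)
        out[i] = (1.0 - alpha) * h + alpha * global_ctx
    return out
```

In a real system the memory bank would hold embeddings of the small candidate set only, which is what keeps the retrieval cheap relative to full pairwise attention.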
Pages: 34