Propagation Enhanced Neural Message Passing for Graph Representation Learning

Cited by: 20
Authors
Fan, Xiaolong [1 ]
Gong, Maoguo [1 ]
Wu, Yue [2 ]
Qin, A. K. [3 ]
Xie, Yu [4 ]
Affiliations
[1] Xidian Univ, Sch Elect Engn, Key Lab Intelligent Percept & Image Understanding, Minist Educ, Xian 710126, Shaanxi, Peoples R China
[2] Xidian Univ, Sch Comp Sci & Technol, Xian 710126, Shaanxi, Peoples R China
[3] Swinburne Univ Technol, Dept Comp Technol, Melbourne, VIC 3122, Australia
[4] Shanxi Univ, Key Lab Computat Intelligence & Chinese Informat Proc, Minist Educ, Taiyuan 030006, Peoples R China
Funding
National Natural Science Foundation of China; Australian Research Council;
Keywords
Message passing; Aggregates; Task analysis; Data models; Predictive models; Graph neural networks; Adaptation models; Graph data mining; graph representation learning; graph neural network; neural message passing; NETWORK;
DOI
10.1109/TKDE.2021.3102964
CLC number
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph Neural Networks (GNNs) make it possible to apply deep neural networks to graph-structured data. Recently, Message Passing Neural Networks (MPNNs) were proposed to generalize several existing graph neural networks into a unified framework. For graph representation learning, MPNNs first generate discriminative node representations with a message passing function and then derive a graph representation from the node representation space with a readout function. In this paper, we analyze the representation capacity of MPNNs for aggregating graph information and observe that existing approaches ignore the self-loop for graph representation learning, i.e., the graph representation produced at one step is never fed back into the next step, which limits representation capacity. To alleviate this issue, we introduce a simple yet effective propagation-enhanced extension, Self-Connected Neural Message Passing (SC-NMP), which aggregates the node representations of the current step together with the graph representation of the previous step. To further improve information flow, we also propose Densely Self-Connected Neural Message Passing (DSC-NMP), which connects each layer to every other layer in a feed-forward fashion. Both architectures are applied at every layer, so the graph representation produced at one layer is used as input by all subsequent layers. Remarkably, combining either architecture with existing GNN variants improves their performance on graph representation learning. Extensive experiments on various benchmark datasets demonstrate the effectiveness of the proposed methods, which achieve superior performance on graph classification and regression tasks.
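The abstract describes SC-NMP and DSC-NMP only at a high level, so the sketch below is one plausible PyTorch reading of it, not the authors' reference implementation. The class names (SCNMPLayer, SCNMP), the sum-style neighbor aggregation, the mean readout, and the learned initial graph state g0 are all illustrative assumptions; the paper's actual message, update, and readout functions may differ.

```python
# Hedged sketch of Self-Connected Neural Message Passing (SC-NMP), assuming
# a dense adjacency matrix, sum aggregation, and a mean readout.
# Hypothetical names and shapes; NOT the authors' reference implementation.
import torch
import torch.nn as nn

class SCNMPLayer(nn.Module):
    """One message-passing step that also consumes the previous graph state."""
    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.msg = nn.Linear(in_dim, hid_dim)                # message function
        self.upd = nn.Linear(in_dim + 2 * hid_dim, hid_dim)  # update function

    def forward(self, x, adj, g_prev):
        # x: (N, in_dim) node features; adj: (N, N) adjacency; g_prev: (hid_dim,)
        m = adj @ self.msg(x)                  # aggregate neighbor messages
        g = g_prev.expand(x.size(0), -1)       # broadcast graph state to all nodes
        h = torch.relu(self.upd(torch.cat([x, m, g], dim=-1)))
        g_next = h.mean(dim=0)                 # readout: next graph representation
        return h, g_next

class SCNMP(nn.Module):
    """Stack of SC-NMP layers; each step feeds its readout into the next one."""
    def __init__(self, in_dim, hid_dim, num_layers, out_dim):
        super().__init__()
        dims = [in_dim] + [hid_dim] * (num_layers - 1)
        self.layers = nn.ModuleList([SCNMPLayer(d, hid_dim) for d in dims])
        self.g0 = nn.Parameter(torch.zeros(hid_dim))  # learned initial graph state
        self.head = nn.Linear(hid_dim, out_dim)       # classification/regression head

    def forward(self, x, adj):
        g = self.g0
        for layer in self.layers:
            x, g = layer(x, adj, g)
        # DSC-NMP (the dense variant) would instead feed the concatenation of
        # ALL earlier graph states into each layer, DenseNet-style.
        return self.head(g)

# Smoke test on a random 6-node graph (shapes only, no training).
x = torch.randn(6, 8)
adj = (torch.rand(6, 6) > 0.5).float()
out = SCNMP(in_dim=8, hid_dim=16, num_layers=3, out_dim=2)(x, adj)
print(out.shape)  # torch.Size([2])
```

One design note on the sketch: feeding g_prev back into the node update is what the abstract calls the self-loop between the readout and the message passing. Without it, the graph representation of step t has no influence on the node representations of step t+1, which is the limitation the paper attributes to plain MPNNs.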
Pages: 1952-1964
Number of pages: 13