Propagation Enhanced Neural Message Passing for Graph Representation Learning

Cited by: 20
Authors
Fan, Xiaolong [1 ]
Gong, Maoguo [1 ]
Wu, Yue [2 ]
Qin, A. K. [3 ]
Xie, Yu [4 ]
Affiliations
[1] Xidian Univ, Sch Elect Engn, Key Lab Intelligent Percept & Image Understanding, Minist Educ, Xian 710126, Shaanxi, Peoples R China
[2] Xidian Univ, Sch Comp Sci & Technol, Xian 710126, Shaanxi, Peoples R China
[3] Swinburne Univ Technol, Dept Comp Technol, Melbourne, VIC 3122, Australia
[4] Shanxi Univ, Key Lab Computat Intelligence & Chinese Informat P, Minist Educ, Taiyuan 030006, Peoples R China
Funding
National Natural Science Foundation of China; Australian Research Council;
Keywords
Message passing; Aggregates; Task analysis; Data models; Predictive models; Graph neural networks; Adaptation models; Graph data mining; graph representation learning; graph neural network; neural message passing; NETWORK;
DOI
10.1109/TKDE.2021.3102964
CLC number
TP18 [Theory of Artificial Intelligence];
Discipline code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph Neural Networks (GNNs) make it possible to apply deep neural networks to graph domains. Recently, Message Passing Neural Networks (MPNNs) were proposed to generalize several existing GNNs into a unified framework. For graph representation learning, MPNNs first generate discriminative node representations with a message passing function and then read out from the node representation space to produce a graph representation with a readout function. In this paper, we analyze the representation capacity of MPNNs for aggregating graph information and observe that existing approaches ignore the self-loop for graph representation learning, which limits their representation capacity. To alleviate this issue, we introduce a simple yet effective propagation-enhanced extension, Self-Connected Neural Message Passing (SC-NMP), which aggregates the node representations of the current step with the graph representation of the previous step. To further improve the information flow, we also propose Densely Self-Connected Neural Message Passing (DSC-NMP), which connects each layer to every other layer in a feed-forward fashion. Both architectures are applied at each layer, and the resulting graph representation is then fed as input to all subsequent layers. Remarkably, combining these two architectures with existing GNN variants improves those models' performance on graph representation learning. Extensive experiments on various benchmark datasets demonstrate the effectiveness of the proposed architectures, which achieve superior performance on graph classification and regression tasks.
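The self-connection idea from the abstract can be sketched in a few lines: at each message passing step, the graph-level representation produced by the readout of the previous step is fed back into the node update alongside the usual neighbor aggregation. The sketch below is illustrative only; the sum aggregation, mean readout, ReLU update, and the function and parameter names are assumptions, not the paper's exact formulation.

```python
import numpy as np

def sc_nmp(adj, feats, weights, steps=2):
    """Illustrative sketch of Self-Connected Neural Message Passing.

    adj     : (n, n) adjacency matrix
    feats   : (n, d) initial node features
    weights : list of (2d, d) matrices, one per step (hypothetical
              parameterization; the concatenated input is [messages, graph rep])
    """
    n, _ = feats.shape
    h = feats
    g = h.mean(axis=0)  # initial graph representation (mean readout, assumed)
    for t in range(steps):
        msgs = adj @ h  # sum aggregation over neighbors (assumed)
        # self-connection: broadcast the previous step's graph
        # representation to every node before the update
        combined = np.concatenate([msgs, np.tile(g, (n, 1))], axis=1)
        h = np.maximum(combined @ weights[t], 0.0)  # ReLU node update
        g = h.mean(axis=0)  # readout after each step
    return h, g
```

In a DSC-NMP-style variant, each step would instead receive the readouts of all previous steps (densely connected), not just the immediately preceding one.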
Pages: 1952-1964 (13 pages)