Propagation Enhanced Neural Message Passing for Graph Representation Learning

Cited by: 20
Authors
Fan, Xiaolong [1 ]
Gong, Maoguo [1 ]
Wu, Yue [2 ]
Qin, A. K. [3 ]
Xie, Yu [4 ]
Affiliations
[1] Xidian Univ, Sch Elect Engn, Key Lab Intelligent Percept & Image Understanding, Minist Educ, Xian 710126, Shaanxi, Peoples R China
[2] Xidian Univ, Sch Comp Sci & Technol, Xian 710126, Shaanxi, Peoples R China
[3] Swinburne Univ Technol, Dept Comp Technol, Melbourne, VIC 3122, Australia
[4] Shanxi Univ, Key Lab Computat Intelligence & Chinese Informat Proc, Minist Educ, Taiyuan 030006, Peoples R China
Funding
National Natural Science Foundation of China; Australian Research Council;
Keywords
Message passing; Aggregates; Task analysis; Data models; Predictive models; Graph neural networks; Adaptation models; Graph data mining; graph representation learning; graph neural network; neural message passing;
DOI
10.1109/TKDE.2021.3102964
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph Neural Networks (GNNs) make it possible to apply deep neural networks to graph domains. Recently, Message Passing Neural Networks (MPNNs) were proposed to generalize several existing graph neural networks into a unified framework. For graph representation learning, MPNNs first generate discriminative node representations with a message passing function and then read out from the node representation space to produce a graph representation with a readout function. In this paper, we analyze the representation capacity of MPNNs for aggregating graph information and observe that existing approaches ignore the self-loop for graph representation learning, which limits their representation capacity. To alleviate this issue, we introduce a simple yet effective propagation-enhanced extension, Self-Connected Neural Message Passing (SC-NMP), which aggregates the node representations of the current step with the graph representation of the previous step. To further improve information flow, we also propose Densely Self-Connected Neural Message Passing (DSC-NMP), which connects each layer to every other layer in a feed-forward fashion. Both architectures are applied at every layer, and the resulting graph representation is then fed as input to all subsequent layers. Remarkably, combining either architecture with existing GNN variants improves their performance on graph representation learning. Extensive experiments on various benchmark datasets demonstrate the effectiveness of the proposed methods, which achieve superior performance on graph classification and regression tasks.
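To make the self-connection concrete, the following is a minimal PyTorch sketch of the SC-NMP idea as we read it from the abstract: each layer updates node states by neighborhood aggregation, reads them out, and fuses that readout with the previous step's graph representation. The mean aggregation, ReLU updates, mean readout, and all identifiers (SCNMPLayer, node_update, graph_update) are our illustrative assumptions, not the paper's exact equations; DSC-NMP would instead feed the graph representations of all earlier layers into each graph update.

```python
import torch
import torch.nn as nn

class SCNMPLayer(nn.Module):
    """One self-connected message passing step (an illustrative sketch).

    Node update: mean aggregation over neighbors followed by a linear map.
    Graph update: the new graph state fuses the readout of the current node
    states with the previous graph state g_{t-1} -- the "self-connection"
    on the graph representation that the abstract describes.
    """

    def __init__(self, dim):
        super().__init__()
        self.node_update = nn.Linear(2 * dim, dim)   # input: [h_v, neighbor mean]
        self.graph_update = nn.Linear(2 * dim, dim)  # input: [readout(H_t), g_{t-1}]

    def forward(self, h, g, adj):
        # h: (n, d) node states; g: (d,) graph state; adj: (n, n) dense adjacency
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        msgs = (adj @ h) / deg                        # mean over neighbors
        h = torch.relu(self.node_update(torch.cat([h, msgs], dim=-1)))
        readout = h.mean(dim=0)                       # permutation-invariant readout
        g = torch.relu(self.graph_update(torch.cat([readout, g], dim=-1)))
        return h, g

# Toy usage: a 4-node graph and three stacked layers sharing hidden size `dim`.
dim = 16
adj = torch.tensor([[0, 1, 1, 0],
                    [1, 0, 0, 1],
                    [1, 0, 0, 1],
                    [0, 1, 1, 0]], dtype=torch.float)
h, g = torch.randn(4, dim), torch.zeros(dim)
for layer in [SCNMPLayer(dim) for _ in range(3)]:
    h, g = layer(h, g, adj)    # g now summarizes the whole graph
```

The key design point relative to a plain MPNN is the extra input to graph_update: a standard readout would compute g from the final node states only, whereas here every layer's graph state carries forward into the next, so graph-level information propagates alongside node-level messages.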
Pages: 1952-1964
Number of pages: 13