Propagation Enhanced Neural Message Passing for Graph Representation Learning

Cited by: 20
Authors
Fan, Xiaolong [1 ]
Gong, Maoguo [1 ]
Wu, Yue [2 ]
Qin, A. K. [3 ]
Xie, Yu [4 ]
Affiliations
[1] Xidian Univ, Sch Elect Engn, Key Lab Intelligent Percept & Image Understanding, Minist Educ, Xian 710126, Shaanxi, Peoples R China
[2] Xidian Univ, Sch Comp Sci & Technol, Xian 710126, Shaanxi, Peoples R China
[3] Swinburne Univ Technol, Dept Comp Technol, Melbourne, VIC 3122, Australia
[4] Shanxi Univ, Key Lab Computat Intelligence & Chinese Informat Proc, Minist Educ, Taiyuan 030006, Peoples R China
Funding
National Natural Science Foundation of China; Australian Research Council;
Keywords
Message passing; Aggregates; Task analysis; Data models; Predictive models; Graph neural networks; Adaptation models; Graph data mining; graph representation learning; graph neural network; neural message passing; NETWORK;
DOI
10.1109/TKDE.2021.3102964
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph Neural Networks (GNNs) make it possible to apply deep neural networks to graph domains. Recently, Message Passing Neural Networks (MPNNs) have been proposed to generalize several existing graph neural networks into a unified framework. For graph representation learning, MPNNs first generate discriminative node representations using a message passing function and then read from the node representation space to produce a graph representation using a readout function. In this paper, we analyze the representation capacity of MPNNs for aggregating graph information and observe that existing approaches ignore the self-loop in graph representation learning, which limits representation capacity. To alleviate this issue, we introduce a simple yet effective propagation-enhanced extension, Self-Connected Neural Message Passing (SC-NMP), which aggregates the node representations of the current step and the graph representation of the previous step. To further improve the information flow, we also propose Densely Self-Connected Neural Message Passing (DSC-NMP), which connects each layer to every other layer in a feed-forward fashion. Both architectures are applied at each layer, and the resulting graph representation is then fed as input into all subsequent layers. Remarkably, combining these two architectures with existing GNN variants improves their performance on graph representation learning. Extensive experiments on various benchmark datasets demonstrate the effectiveness of the proposed architectures, which achieve superior performance on graph classification and regression tasks.
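The abstract's two architectures can be made concrete with a short sketch. Below is a minimal, illustrative PyTorch implementation, assuming a mean-neighbor aggregation as the message function, a mean readout, and linear update functions; the class names (SCNMPLayer, DSCNMP) and every design choice beyond what the abstract states are assumptions for illustration, not the authors' actual implementation.

```python
import torch
import torch.nn as nn


class SCNMPLayer(nn.Module):
    """Self-Connected Neural Message Passing layer (illustrative sketch).

    Per the abstract: at each step, the graph representation aggregates the
    node representations of the current step together with the graph
    representation of the previous step (the "self-connection").
    """

    def __init__(self, dim: int):
        super().__init__()
        self.node_update = nn.Linear(2 * dim, dim)   # [h_v ; mean_u h_u] -> h_v'
        self.graph_update = nn.Linear(2 * dim, dim)  # [readout(h) ; g_prev] -> g

    def forward(self, h, adj, g_prev):
        # h: (N, d) node features; adj: (N, N) dense adjacency; g_prev: (d,).
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        msg = adj @ h / deg                          # mean over neighbors
        h = torch.relu(self.node_update(torch.cat([h, msg], dim=-1)))
        readout = h.mean(dim=0)                      # simple mean readout
        g = torch.relu(self.graph_update(torch.cat([readout, g_prev], dim=-1)))
        return h, g


class DSCNMP(nn.Module):
    """Densely Self-Connected variant (illustrative sketch): each layer's
    graph update sees the graph representations of *all* earlier layers in a
    feed-forward fashion, not just the immediately preceding one."""

    def __init__(self, dim: int, num_layers: int):
        super().__init__()
        self.layers = nn.ModuleList(SCNMPLayer(dim) for _ in range(num_layers))
        # Fuse the concatenation of all earlier graph states into one vector.
        self.fuse = nn.ModuleList(
            nn.Linear((t + 1) * dim, dim) for t in range(num_layers)
        )

    def forward(self, h, adj):
        g_history = [torch.zeros(h.size(1), device=h.device)]  # g_0 = 0
        for layer, fuse in zip(self.layers, self.fuse):
            g_prev = fuse(torch.cat(g_history, dim=-1))  # dense connection
            h, g = layer(h, adj, g_prev)
            g_history.append(g)
        return g_history[-1]                         # final graph representation


# Toy usage: a 5-node ring graph with 8-dimensional node features.
torch.manual_seed(0)
n, d = 5, 8
h = torch.randn(n, d)
idx = torch.arange(n)
adj = torch.zeros(n, n)
adj[idx, (idx + 1) % n] = 1.0
adj = adj + adj.t()
model = DSCNMP(dim=d, num_layers=3)
print(model(h, adj).shape)  # torch.Size([8])
```

Under this reading, the graph representation produced at each layer flows into all subsequent layers, matching the abstract's description; the aggregation, readout, and fusion functions shown here are placeholders that could be swapped for those actually used in the paper.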
Pages: 1952-1964
Page count: 13
Related Papers
50 records in total
  • [21] BeMap: Balanced Message Passing for Fair Graph Neural Network
    Lin, Xiao
    Kang, Jian
    Cong, Weilin
    Tong, Hanghang
    LEARNING ON GRAPHS CONFERENCE, VOL 231, 2023, 231
  • [22] Domain-adaptive message passing graph neural network
    Shen, Xiao
    Pan, Shirui
    Choi, Kup-Sze
    Zhou, Xi
    NEURAL NETWORKS, 2023, 164 : 439 - 454
  • [23] Redundancy-Free Message Passing for Graph Neural Networks
    Chen, Rongqin
    Zhang, Shenghui
U, Leong Hou
    Li, Ye
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [24] Hierarchical Graph Neural Network: A Lightweight Image Matching Model with Enhanced Message Passing of Local and Global Information in Hierarchical Graph Neural Networks
    Gyamfi, Enoch Opanin
    Qin, Zhiguang
    Danso, Juliana Mantebea
    Adu-Gyamfi, Daniel
    INFORMATION, 2024, 15 (10)
  • [25] Learning Attributed Graph Representations with Communicative Message Passing Transformer
    Chen, Jianwen
    Zheng, Shuangjia
    Song, Ying
    Rao, Jiahua
    Yang, Yuedong
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 2242 - 2248
  • [26] Adaptive Neural Message Passing for Inductive Learning on Hypergraphs
    Arya, Devanshu
    Gupta, Deepak K.
    Rudinac, Stevan
    Worring, Marcel
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2025, 47 (01) : 19 - 31
  • [27] Graph neural architecture search with heterogeneous message-passing mechanisms
    Wang, Yili
    Chen, Jiamin
    Li, Qiutong
    He, Changlong
    Gao, Jianliang
    KNOWLEDGE AND INFORMATION SYSTEMS, 2024, 66 (07) : 4283 - 4308
  • [28] Addressing data association by message passing over graph neural networks
    Tedeschini, Bernardo Camajori
    Brambilla, Mattia
    Barbieri, Luca
    Nicoli, Monica
    2022 25TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION 2022), 2022,
  • [29] Rethinking Graph Neural Architecture Search from Message-passing
    Cai, Shaofei
    Li, Liang
    Deng, Jincan
    Zhang, Beichen
    Zha, Zheng-Jun
    Su, Li
    Huang, Qingming
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 6653 - 6662
  • [30] Contrastive message passing for robust graph neural networks with sparse labels
    Yan, Hui
    Gao, Yuan
    Ai, Guoguo
    Wang, Huan
    Li, Xin
    NEURAL NETWORKS, 2025, 182