Learning Prioritized Node-Wise Message Propagation in Graph Neural Networks

Cited by: 1
Authors
Cheng, Yao [1 ]
Chen, Minjie [1 ]
Shan, Caihua [2 ]
Li, Xiang [1 ]
Affiliations
[1] East China Normal Univ, Shanghai 200062, Peoples R China
[2] Microsoft Res Asia, Shanghai 200232, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Classification; graph heterophily; graph neural networks; representation learning;
DOI
10.1109/TKDE.2024.3436909
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph neural networks (GNNs) have recently received significant attention. Learning node-wise message propagation in GNNs aims to set personalized propagation steps for different nodes in the graph. Despite this success, existing methods ignore node priority, which can be reflected by node influence and heterophily. In this paper, we propose a versatile framework, PriPro, which can be integrated with most existing GNN models and aims to learn prioritized node-wise message propagation in GNNs. Specifically, the framework consists of three components: a backbone GNN model, a propagation controller that determines the optimal propagation steps for nodes, and a weight controller that computes priority scores for nodes. We design a mutually enhanced mechanism to jointly compute node priorities, optimal propagation steps, and label predictions. We also propose an alternating optimization strategy to learn the parameters of the backbone GNN model and the two parametric controllers. We conduct extensive experiments comparing our framework with 12 state-of-the-art competitors on 10 benchmark datasets. Experimental results show that our framework leads to superior performance in terms of both propagation strategies and node representations.
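The core idea in the abstract — each node receives its own number of propagation steps, while a weight controller scores node priority from influence and heterophily — can be illustrated with a minimal numpy sketch. This is an assumption-laden illustration, not the authors' implementation: the function names (`normalize_adj`, `node_wise_propagate`, `priority_scores`) are hypothetical, and the degree-plus-heterophily priority is a hand-crafted proxy for what the paper learns with a parametric controller.

```python
import numpy as np

def normalize_adj(A):
    """Symmetrically normalize A with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    return A_hat / np.sqrt(np.outer(d, d))

def priority_scores(A, y_pred):
    """Toy priority score: node influence (normalized degree) plus a heterophily
    proxy (fraction of neighbors whose predicted label differs). A hypothetical
    stand-in for the paper's learned weight controller."""
    deg = A.sum(axis=1)
    influence = deg / deg.max()
    hetero = np.array([(A[i] * (y_pred != y_pred[i])).sum() / max(deg[i], 1.0)
                       for i in range(A.shape[0])])
    return influence + hetero

def node_wise_propagate(A, X, steps, max_step):
    """Propagate features over the graph, freezing each node's representation
    once it reaches its personal propagation-step budget steps[i]."""
    P = normalize_adj(A)
    H = X.copy()
    for t in range(1, max_step + 1):
        H_next = P @ H
        active = steps >= t            # nodes still propagating at step t
        H = np.where(active[:, None], H_next, H)
    return H
```

A node assigned zero steps keeps its raw features, while its neighbors continue to aggregate; in the paper this per-node budget is chosen by the propagation controller rather than given as input.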
Pages: 8670-8681
Page count: 12
Related Papers
50 in total
  • [21] Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings
    Chen, Yu
    Wu, Lingfei
    Zaki, Mohammed J.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [22] Efficient Learning of Linear Graph Neural Networks via Node Subsampling
    Shin, Seiyun
    Shomorony, Ilan
    Zhao, Han
    Advances in Neural Information Processing Systems, 2023, 36 : 55479 - 55501
  • [23] Hierarchical Representation Learning in Graph Neural Networks With Node Decimation Pooling
    Bianchi, Filippo Maria
    Grattarola, Daniele
    Livi, Lorenzo
    Alippi, Cesare
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (05) : 2195 - 2207
  • [25] Label Propagation and Graph Neural Networks
    Benson, Austin
    2021 IEEE INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM WORKSHOPS (IPDPSW), 2021, : 241 - 241
  • [27] Long-tailed graph neural networks via graph structure learning for node classification
    Lin, Junchao
    Wan, Yuan
    Xu, Jingwen
    Qi, Xingchen
    APPLIED INTELLIGENCE, 2023, 53 (17) : 20206 - 20222
  • [28] Lifelong Learning of Graph Neural Networks for Open-World Node Classification
    Galke, Lukas
    Franke, Benedikt
    Zielke, Tobias
    Scherp, Ansgar
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [29] Distributed Filtering for Complex Networks Under Multiple Event-Triggered Transmissions Within Node-Wise Communications
    Liu, Yang
    Wang, Zidong
    Zou, Lei
    Hu, Jun
    Dong, Hongli
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2022, 9 (04): : 2521 - 2534
  • [30] Transition Propagation Graph Neural Networks for Temporal Networks
    Zheng, Tongya
    Feng, Zunlei
    Zhang, Tianli
    Hao, Yunzhi
    Song, Mingli
    Wang, Xingen
    Wang, Xinyu
    Zhao, Ji
    Chen, Chun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (04) : 4567 - 4579