Extending the Design Space of Graph Neural Networks by Rethinking Folklore Weisfeiler-Lehman

Cited: 0
Authors
Feng, Jiarui [1 ]
Kong, Lecheng [1 ]
Liu, Hao [1 ]
Tao, Dacheng [2 ]
Li, Fuhai [1 ]
Zhang, Muhan [3 ]
Chen, Yixin [1 ]
Affiliations
[1] Washington Univ, St Louis, MO 63110 USA
[2] JD Explore Acad, Beijing, Peoples R China
[3] Peking Univ, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
N/A
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104; 0812; 0835; 1405
Abstract
Message passing neural networks (MPNNs) have emerged as the most popular framework of graph neural networks (GNNs) in recent years. However, their expressive power is limited by the 1-dimensional Weisfeiler-Lehman (1-WL) test. Some works are inspired by k-WL/FWL (Folklore WL) and design the corresponding neural versions. Despite their high expressive power, this line of research has serious limitations. In particular, (1) k-WL/FWL requires at least O(n^k) space, which is impractical for large graphs even when k = 3; (2) the design space of k-WL/FWL is rigid, with the only adjustable hyper-parameter being k. To tackle the first limitation, we propose an extension, (k, t)-FWL. We theoretically prove that even if we fix the space complexity to O(n^k) (for any k >= 2) in (k, t)-FWL, we can construct an expressiveness hierarchy up to solving the graph isomorphism problem. To tackle the second problem, we propose k-FWL+, which considers any equivariant set as neighbors instead of all nodes, thereby greatly expanding the design space of k-FWL. Combining these two modifications results in a flexible and powerful framework, (k, t)-FWL+. We demonstrate that (k, t)-FWL+ can implement most existing models with matching expressiveness. We then introduce an instance of (k, t)-FWL+ called Neighborhood^2-FWL (N^2-FWL), which is both practically and theoretically sound. We prove that N^2-FWL is no less powerful than 3-WL and can encode many substructures while requiring only O(n^2) space. Finally, we design its neural version, N^2-GNN, and evaluate its performance on various tasks. N^2-GNN achieves record-breaking results on ZINC-Subset (0.059), outperforming the previous SOTA by 10.6%. Moreover, N^2-GNN achieves new SOTA results on the BREC dataset (71.8%) among all existing highly expressive GNN methods.
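For context, the 1-WL test that bounds MPNN expressiveness (mentioned in the abstract) is a simple color-refinement procedure. The sketch below is illustrative only and not from the paper; function names and the dict-of-neighbor-lists graph encoding are assumptions. Graphs that 1-WL cannot distinguish (e.g. a 6-cycle vs. two disjoint triangles, both 2-regular) are exactly the cases motivating higher-order k-WL/FWL variants.

```python
from collections import Counter

def wl_1_colors(adj):
    """1-WL color refinement. adj: dict node -> iterable of neighbors.
    Returns the stable coloring as a dict node -> color id."""
    nodes = list(adj)
    colors = {v: 0 for v in nodes}  # uniform initial coloring
    for _ in range(len(nodes)):  # refinement stabilizes within n rounds
        # New color = old color plus the multiset of neighbor colors.
        sigs = {
            v: (colors[v], tuple(sorted(Counter(colors[u] for u in adj[v]).items())))
            for v in nodes
        }
        relabel = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        new_colors = {v: relabel[sigs[v]] for v in nodes}
        if new_colors == colors:  # stable partition reached
            break
        colors = new_colors
    return colors

def wl_1_indistinguishable(adj_a, adj_b):
    """True if 1-WL cannot distinguish the two graphs."""
    # Refine on the disjoint union so color ids are directly comparable.
    union = {("a", v): [("a", u) for u in nbrs] for v, nbrs in adj_a.items()}
    union.update({("b", v): [("b", u) for u in nbrs] for v, nbrs in adj_b.items()})
    colors = wl_1_colors(union)
    hist = lambda side: Counter(c for (s, _), c in colors.items() if s == side)
    return hist("a") == hist("b")
```

Since every node in both the 6-cycle and the two-triangle graph sees the same multiset of neighbor colors at every round, refinement stabilizes immediately and the histograms match, even though the graphs are not isomorphic; this is the failure mode that 3-WL (and the N^2-FWL instance above) can resolve.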
Pages: 36
Related papers
50 items total
  • [31] Profiling the Design Space for Graph Neural Networks based Collaborative Filtering
    Wang, Zhenyi
    Zhao, Huan
    Shi, Chuan
    WSDM'22: PROCEEDINGS OF THE FIFTEENTH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, 2022, : 1109 - 1119
  • [32] Rethinking Causal Relationships Learning in Graph Neural Networks
    Gao, Hang
    Yao, Chengyu
    Li, Jiangmeng
    Si, Lingyu
    Jin, Yifan
    Wu, Fengge
    Zheng, Changwen
    Liu, Huaping
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 11, 2024, : 12145 - 12154
  • [33] Exponentially Improving the Complexity of Simulating the Weisfeiler-Lehman Test with Graph Neural Networks
    Aamand, Anders
    Chen, Justin Y.
    Indyk, Piotr
    Narayanan, Shyam
    Rubinfeld, Ronitt
    Schiefer, Nicholas
    Silwal, Sandeep
    Wagner, Tal
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [34] Graph Neural Networks for High-Level Synthesis Design Space Exploration
    Ferretti, Lorenzo
    Cini, Andrea
    Zacharopoulos, Georgios
    Alippi, Cesare
    Pozzi, Laura
    ACM TRANSACTIONS ON DESIGN AUTOMATION OF ELECTRONIC SYSTEMS, 2023, 28 (02)
  • [35] Rethinking the robustness of graph neural networks: An information theory perspective
    Li, Ding
    Xia, Hui
    Li, Xin
    Zhang, Rui
    Ma, Mingda
    KNOWLEDGE-BASED SYSTEMS, 2025, 314
  • [36] Edge-Level Explanations for Graph Neural Networks by Extending Explainability Methods for Convolutional Neural Networks
    Kasanishi, Tetsu
    Wang, Xueting
    Yamasaki, Toshihiko
    23RD IEEE INTERNATIONAL SYMPOSIUM ON MULTIMEDIA (ISM 2021), 2021, : 249 - 252
  • [37] Rethinking Higher-order Representation Learning with Graph Neural Networks
    Xu, Tuo
    Zou, Lei
    LEARNING ON GRAPHS CONFERENCE, VOL 231, 2023, 231
  • [38] Recurrent Space-time Graph Neural Networks
    Nicolicioiu, Andrei
    Duta, Iulia
    Leordeanu, Marius
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [39] Rethinking the Item Order in Session-based Recommendation with Graph Neural Networks
    Qiu, Ruihong
    Li, Jingjing
    Huang, Zi
    Yin, Hongzhi
    PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019, : 579 - 588
  • [40] Circuit design completion using graph neural networks
    Said, Anwar
    Shabbir, Mudassir
    Broll, Brian
    Abbas, Waseem
    Voelgyesi, Peter
    Koutsoukos, Xenofon
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (16): : 12145 - 12157