High-Order Pooling for Graph Neural Networks with Tensor Decomposition

Cited: 0
Authors
Hua, Chenqing [1 ,4 ]
Rabusseau, Guillaume [2 ,4 ,5 ,6 ]
Tang, Jian [3 ,4 ,6 ]
Affiliations
[1] McGill Univ, Montreal, PQ, Canada
[2] Univ Montreal, Montreal, PQ, Canada
[3] HEC Montreal, Montreal, PQ, Canada
[4] Mila, Montreal, PQ, Canada
[5] Univ Montreal, DIRO, Montreal, PQ, Canada
[6] CIFAR AI Chair, Edmonton, AB, Canada
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022) | 2022
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
DOI
Not available
CLC classification
TP18 [Theory of artificial intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Graph Neural Networks (GNNs) are attracting growing attention due to their effectiveness and flexibility in modeling a variety of graph-structured data. Existing GNN architectures usually adopt simple pooling operations (e.g., sum, average, max) when aggregating messages from a local neighborhood to update node representations, or when pooling node representations from the entire graph to compute the graph representation. Though simple and effective, these linear operations do not model high-order non-linear interactions among nodes. We propose the Tensorized Graph Neural Network (tGNN), a highly expressive GNN architecture relying on tensor decomposition to model high-order non-linear node interactions. tGNN leverages the symmetric CP decomposition to efficiently parameterize permutation-invariant multilinear maps for modeling node interactions. Theoretical and empirical analysis on both node and graph classification tasks shows the superiority of tGNN over competitive baselines. In particular, tGNN achieves the most solid results on two OGB node classification datasets and one OGB graph classification dataset.
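To make the abstract's core idea concrete, below is a minimal NumPy sketch of pooling via a symmetric CP-decomposed multilinear map. This is an illustrative assumption of how such pooling can be written, not the paper's actual implementation; the names `cp_pooling`, `A`, and `C` are hypothetical. Sharing one factor matrix `A` across all modes and multiplying elementwise across neighbors makes the map permutation-invariant, while appending a constant 1 feature (a common trick in CP-based models) lets lower-order interactions be captured alongside the highest-order term.

```python
import numpy as np

def cp_pooling(H, A, C):
    """Hypothetical sketch of high-order pooling via symmetric CP decomposition.

    H : (n, d)      feature matrix of the n nodes being pooled
    A : (d+1, R)    factor matrix shared across all modes (rank R);
                    sharing it yields permutation invariance
    C : (R, d_out)  output projection
    """
    n, d = H.shape
    # Append a constant 1 feature so interactions of every order up to n
    # (not only the full order-n product) contribute to the result.
    H1 = np.hstack([H, np.ones((n, 1))])
    # Project each node onto the R CP components: shape (n, R).
    P = H1 @ A
    # Elementwise product across nodes: a symmetric multilinear interaction,
    # invariant to any reordering of the rows of H.
    z = P.prod(axis=0)
    # Map the R interaction coefficients to the output space.
    return z @ C
```

Because the product over nodes commutes, shuffling the rows of `H` leaves the output unchanged, which is the permutation invariance the abstract refers to; the CP rank `R` controls the cost of parameterizing the multilinear map.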
Pages: 13
Related papers
50 records in total
  • [1] Adaptive tensor networks decomposition for high-order tensor recovery and compression
    Nie, Chang
    Wang, Huan
    Zhao, Lu
    INFORMATION SCIENCES, 2023, 629 : 667 - 684
  • [2] A neural tensor decomposition model for high-order sparse data recovery
    Liao, Tianchi
    Yang, Jinghua
    Chen, Chuan
    Zheng, Zibin
    INFORMATION SCIENCES, 2024, 658
  • [3] Graph Neural Networks With High-Order Polynomial Spectral Filters
    Zeng, Zeyuan
    Peng, Qinke
    Mou, Xu
    Wang, Ying
    Li, Ruimeng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (09) : 12590 - 12603
  • [4] Second-Order Pooling for Graph Neural Networks
    Wang, Zhengyang
    Ji, Shuiwang
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (06) : 6870 - 6880
  • [5] Subgraph Pattern Neural Networks for High-Order Graph Evolution Prediction
    Meng, Changping
    Mouli, S. Chandra
    Ribeiro, Bruno
    Neville, Jennifer
    THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, : 3778 - 3787
  • [6] Higher-order Clustering and Pooling for Graph Neural Networks
    Duval, Alexandre
    Malliaros, Fragkiskos
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 426 - 435
  • [7] Neural Pooling for Graph Neural Networks
    Harsha, Sai Sree
    Mishra, Deepak
    PATTERN RECOGNITION AND MACHINE INTELLIGENCE, PREMI 2021, 2024, 13102 : 171 - 180
  • [8] Tensor-based Polynomial Features Generation for High-order Neural Networks
    Oswald, Cyril
    Peichl, Adam
    Vyhlidal, Tomas
    PROCESS CONTROL '21 - PROCEEDING OF THE 2021 23RD INTERNATIONAL CONFERENCE ON PROCESS CONTROL (PC), 2021, : 175 - 179
  • [9] High-order Hopfield neural networks
    Shen, Y
    Zong, XJ
    Jiang, MH
    ADVANCES IN NEURAL NETWORKS - ISNN 2005, PT 1, PROCEEDINGS, 2005, 3496 : 235 - 240