MECCH: Metapath Context Convolution-based Heterogeneous Graph Neural Networks

Citations: 10
Authors
Fu, Xinyu [1 ]
King, Irwin [1 ]
Affiliations
[1] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Hong Kong, Peoples R China
Keywords
Graph neural networks; Heterogeneous information networks; Graph representation learning
DOI
10.1016/j.neunet.2023.11.030
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Heterogeneous graph neural networks (HGNNs) have been proposed for representation learning on graph-structured data with multiple types of nodes and edges. To mitigate the performance degradation that arises when HGNNs become deep, researchers incorporate metapaths into HGNNs to associate nodes that are semantically close but topologically distant in the graph. However, existing metapath-based models suffer from either information loss or high computation costs. To address these problems, we present a novel Metapath Context Convolution-based Heterogeneous Graph Neural Network (MECCH). MECCH leverages metapath contexts, a new kind of graph structure that enables lossless node information aggregation while avoiding redundancy. Specifically, after feature preprocessing, MECCH applies three novel components to extract comprehensive information from the input graph efficiently: (1) metapath context construction, (2) metapath context encoder, and (3) convolutional metapath fusion. Experiments on five real-world heterogeneous graph datasets for node classification and link prediction show that MECCH achieves higher prediction accuracy than state-of-the-art baselines with improved computational efficiency. The code is available at https://github.com/cynricfu/MECCH.
Pages: 266-275
Page count: 10