EHG: efficient heterogeneous graph transformer for multiclass node classification

Cited by: 0
Authors
Wang, Man [1 ]
Liu, Shouqiang [1 ]
Deng, Zhen [2 ]
Affiliations
[1] South China Normal Univ, Sch Artificial Intelligence, Foshan 528000, Peoples R China
[2] Southern Med Univ, Neurol Dept, Nanfang Hosp, Guangzhou 510000, Peoples R China
Source
ADVANCES IN CONTINUOUS AND DISCRETE MODELS | 2025, Vol. 2025, Issue 01
Keywords
Graph Neural Networks; Representation Learning; Graph Attention; Heterogeneous Graph; Graph Analysis; Graph Transformer; Knowledge Graph;
DOI
10.1186/s13662-025-03885-0
CLC number
O29 [Applied Mathematics]
Discipline code
070104
Abstract
Graph neural networks empowered by the Transformer's self-attention mechanism have emerged as a preferred solution for many graph classification and prediction tasks. Despite their efficacy, these networks are often hampered by their quadratic computational complexity and large model size, which pose significant challenges during graph training and inference. In this study, we present an innovative heterogeneous graph transformer that adeptly navigates these limitations while capturing the rich diversity and semantic depth of graphs with various node and edge types. Our method, which streamlines the key-value interaction to a straightforward linear layer operation, maintains the same level of ranking accuracy while significantly reducing computational overhead and accelerating model training. We introduce the "EHG" model, a testament to our approach's efficacy, showcasing remarkable performance in multiclass node classification on heterogeneous graphs. Our model's evaluation on the DBLP, ACM, OGBN-MAG, and OAG datasets reveals its superiority over existing heterogeneous graph models under identical hyperparameter configurations. Notably, our model achieves a reduction of approximately 25% in parameter count and nearly 20% savings in training time compared to the leading heterogeneous graph-transformer models.
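The abstract's core efficiency idea, streamlining the key-value interaction to a linear layer, is in the spirit of Linformer-style attention (Wang et al., 2020): learned linear maps compress the n keys and values down to k << n rows, so the attention matrix becomes n x k instead of n x n, reducing the quadratic cost to linear in sequence (or node) count. The NumPy sketch below illustrates that general mechanism only; the shapes, projection matrices, and function names are illustrative assumptions, not the paper's actual EHG implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def linear_kv_attention(Q, K, V, E, F):
    """Attention with linearly compressed keys/values (hypothetical sketch).

    Q, K, V: (n, d) query/key/value matrices.
    E, F:    (k, n) learned projection matrices that compress the n
             keys/values to k rows, so the score matrix is (n, k)
             rather than (n, n).
    """
    K_proj = E @ K                        # (k, d) compressed keys
    V_proj = F @ V                        # (k, d) compressed values
    d = Q.shape[-1]
    scores = Q @ K_proj.T / np.sqrt(d)    # (n, k), not (n, n)
    return softmax(scores) @ V_proj       # (n, d) attention output

rng = np.random.default_rng(0)
n, d, k = 64, 16, 8                       # illustrative sizes
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
E = rng.standard_normal((k, n)) / np.sqrt(n)
F = rng.standard_normal((k, n)) / np.sqrt(n)
out = linear_kv_attention(Q, K, V, E, F)
print(out.shape)  # (64, 16)
```

With n nodes the score matrix here costs O(n * k) memory instead of O(n^2), which is the kind of saving the abstract's reported parameter and training-time reductions rely on.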
Pages: 18