GTAT: empowering graph neural networks with cross attention

Cited: 0
Authors
Shen, Jiahao [1 ]
Ain, Qura Tul [1 ]
Liu, Yaohua [1 ]
Liang, Banqing [1 ]
Qiang, Xiaoli [2 ]
Kou, Zheng [1 ]
Affiliations
[1] Guangzhou Univ, Inst Comp Sci & Technol, Guangzhou 510006, Peoples R China
[2] Guangzhou Univ, Sch Comp Sci & Cyber Engn, Guangzhou 510006, Peoples R China
Source
SCIENTIFIC REPORTS | 2025, Vol. 15, No. 1
Funding
National Natural Science Foundation of China;
Keywords
Graph learning; Graph neural networks; Network topology; Cross attention mechanism;
DOI
10.1038/s41598-025-88993-3
CLC classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Subject classification codes
07 ; 0710 ; 09 ;
Abstract
Graph Neural Networks (GNNs) serve as a powerful framework for representation learning on graph-structured data, capturing node information by recursively aggregating and transforming the representations of neighboring nodes. Topology plays an important role in learning graph representations and affects the performance of GNNs. However, current methods fail to adequately integrate topological information into graph representation learning. To better leverage topological information and enhance representation capabilities, we propose Graph Topology Attention Networks (GTAT). Specifically, GTAT first extracts topology features from the graph's structure and encodes them into topology representations. The node and topology representations are then fed into cross attention GNN layers for interaction. This integration allows the model to dynamically adjust the influence of node features and topological information, improving the expressiveness of node representations. Experimental results on various graph benchmark datasets demonstrate that GTAT outperforms recent state-of-the-art methods. Further analysis reveals GTAT's capability to mitigate the over-smoothing issue and its increased robustness against noisy data.
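The abstract's core mechanism can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the degree-based topology encoding, the projection matrices `Wq`/`Wk`/`Wv`, and all dimensions are assumptions; the paper's actual topology extraction and layer design may differ.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def topology_features(adj):
    """Toy topology encoding (hypothetical): node degree and
    mean neighbor degree, stacked as a 2-dim feature per node."""
    deg = adj.sum(axis=1)
    mean_nbr_deg = (adj @ deg) / np.maximum(deg, 1.0)
    return np.stack([deg, mean_nbr_deg], axis=1)

def cross_attention_layer(node_h, topo_h, Wq, Wk, Wv):
    """Node representations attend over topology representations:
    queries come from node features, keys/values from topology."""
    q = node_h @ Wq
    k = topo_h @ Wk
    v = topo_h @ Wv
    scores = q @ k.T / np.sqrt(q.shape[-1])   # scaled dot-product scores
    attn = softmax(scores, axis=-1)           # per-node weights over topology
    return attn @ v                           # topology-aware node update

# Tiny 4-node path graph with random features (shapes are assumptions).
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
node_h = rng.normal(size=(4, 8))
topo_h = topology_features(adj) @ rng.normal(size=(2, 8))  # encode to d=8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = cross_attention_layer(node_h, topo_h, Wq, Wk, Wv)
```

Because the attention weights are computed from both node and topology representations, each node can dynamically rebalance how much structural information enters its update, which is the interaction the abstract describes.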
Pages: 13
Related Papers
50 in total
  • [1] Graph Attention Networks for Neural Social Recommendation
    Mu, Nan
    Zha, Daren
    He, Yuanye
    Tang, Zhihao
    2019 IEEE 31ST INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2019), 2019, : 1320 - 1327
  • [2] Bi-Level Attention Graph Neural Networks
    Iyer, Roshni G.
    Wang, Wei
    Sun, Yizhou
    2021 21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2021), 2021, : 1126 - 1131
  • [3] Attention-based graph neural networks: a survey
    Sun, Chengcheng
    Li, Chenhao
    Lin, Xiang
    Zheng, Tianji
    Meng, Fanrong
    Rui, Xiaobin
    Wang, Zhixiao
    ARTIFICIAL INTELLIGENCE REVIEW, 2023, 56 : 2263 - 2310
  • [4] Graph Attention Neural Networks for Point Cloud Recognition
    Li, Zongmin
    Zhang, Jun
    Li, Guanlin
    Liu, Yujie
    Li, Siyuan
    2019 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2019, : 387 - 392
  • [5] Supervised Attention Using Homophily in Graph Neural Networks
    Chatzianastasis, Michail
    Nikolentzos, Giannis
    Vazirgiannis, Michalis
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT IV, 2023, 14257 : 576 - 586
  • [6] Attention-based graph neural networks: a survey
    Sun, Chengcheng
    Li, Chenhao
    Lin, Xiang
    Zheng, Tianji
    Meng, Fanrong
    Rui, Xiaobin
    Wang, Zhixiao
    ARTIFICIAL INTELLIGENCE REVIEW, 2023, 56 (SUPPL 2) : 2263 - 2310
  • [7] Graph neural networks with multiple kernel ensemble attention
    Zhang, Haimin
    Xu, Min
    KNOWLEDGE-BASED SYSTEMS, 2021, 229
  • [8] CoRGi: Content-Rich Graph Neural Networks with Attention
    Kim, Jooyeon
    Lamb, Angus
    Woodhead, Simon
    Jones, Simon Peyton
    Zhang, Cheng
    Allamanis, Miltiadis
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 773 - 783
  • [9] Seizure localisation with attention-based graph neural networks
    Grattarola, Daniele
    Livi, Lorenzo
    Alippi, Cesare
    Wennberg, Richard
    Valiante, Taufik A.
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 203
  • [10] Stochastic Graph Neural Networks
    Gao, Zhan
    Isufi, Elvin
    Ribeiro, Alejandro
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2021, 69 : 4428 - 4443