An enhanced residual learning framework for Graph Neural Networks based on Dual Random Walk

Cited by: 0
Authors
Fan, Jin [1 ,2 ]
Gu, Zhangyu [1 ]
Yang, Jiajun [1 ]
Wu, Huifeng [1 ,2 ]
Sun, Danfeng [1 ,2 ]
Wu, Jia [3 ]
Affiliations
[1] Hangzhou Dianzi Univ, Dept Comp Sci & Technol, Hangzhou, Peoples R China
[2] Hangzhou Dianzi Univ, Zhejiang Prov Key Lab Ind Internet Discrete Ind, Hangzhou, Peoples R China
[3] Macquarie Univ, Dept Comp, Sydney, Australia
Funding
National Natural Science Foundation of China;
Keywords
Node classification; Graph Neural Networks; Random walk; Hybrid Residual Connections; Enhanced Residual Learning;
DOI
10.1016/j.knosys.2025.113822
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Graph Neural Networks (GNNs) have demonstrated outstanding performance in node classification tasks. Although extensive research has been conducted, prevalent methods primarily focus on feature engineering while paying limited attention to the structural information encoded in adjacency matrices. Most current approaches fail to effectively capture both global and local topological properties and often suffer from over-smoothing during information propagation. In this work, we propose an Enhanced Residual Learning Framework for Graph Neural Networks based on Dual Random Walk (ResDW-GNN). This framework employs a dual random walk strategy combining breadth-first search (BFS) and depth-first search (DFS), where DFS walks preserve key heterogeneity and global connectivity features, while BFS walks preserve homogeneity and local structural details. We design a Dual-walk Node Representation (DNR) mechanism that integrates both BFS and DFS walking strategies to capture different structural perspectives. Through this mechanism, we learn DNR-based node-level weights that effectively balance different structural aspects, allowing the model to adaptively focus on important features from both global and local structural patterns. Furthermore, to better integrate multi-scale information and address the over-smoothing problem in GNNs, we propose two key architectural components: a Hybrid Residual Connections (HRC) mechanism and an Adaptive Iterative Graph Convolution (AIGC) module. Comprehensive evaluations on nine real-world benchmark datasets demonstrate that our proposed ResDW-GNN achieves state-of-the-art performance, consistently outperforming existing methods across both homophilous and heterophilous graph structures.
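The sketch below is not the authors' implementation; it only illustrates, under stated assumptions, the dual-walk idea from the abstract: sample BFS-biased and DFS-biased random walks per node and blend the resulting structural features with a node-level weight. The sampler uses a node2vec-style in-out bias (here a single parameter `q`), and the names `biased_walk`, `walk_features`, and the fixed blend weight `alpha` are illustrative assumptions, since ResDW-GNN's actual DNR weighting is learned and its HRC/AIGC modules are not reproduced here.

```python
# Minimal sketch (not the paper's code) of a dual BFS-/DFS-biased walk and a
# simple blend of the two structural views; all names/parameters are assumed.
import random
from collections import Counter

def biased_walk(adj, start, length, q):
    """Second-order biased walk: q > 1 keeps the walk near the previous node
    (BFS-like, local view); q < 1 pushes it outward (DFS-like, global view)."""
    walk, prev = [start], None
    for _ in range(length - 1):
        cur = walk[-1]
        nbrs = adj[cur]
        if not nbrs:
            break
        if prev is None:
            nxt = random.choice(nbrs)
        else:
            # Neighbours at distance <= 1 from `prev` get weight 1,
            # neighbours that move further away get weight 1/q.
            near_prev = set(adj[prev]) | {prev}
            weights = [1.0 if n in near_prev else 1.0 / q for n in nbrs]
            nxt = random.choices(nbrs, weights=weights, k=1)[0]
        prev = cur
        walk.append(nxt)
    return walk

def walk_features(adj, q, walks_per_node=10, length=8):
    """Visit-frequency profile of every node under a given walk bias."""
    feats = {}
    for v in adj:
        counts = Counter()
        for _ in range(walks_per_node):
            counts.update(biased_walk(adj, v, length, q))
        total = sum(counts.values())
        feats[v] = {u: c / total for u, c in counts.items()}
    return feats

if __name__ == "__main__":
    random.seed(0)
    # Toy graph: two triangles (0-2 and 3-5) bridged by the edge 2-3.
    adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
           3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
    bfs_view = walk_features(adj, q=4.0)   # local, homophily-oriented view
    dfs_view = walk_features(adj, q=0.25)  # global, heterophily-oriented view
    alpha = 0.5  # in the paper this blend is a learned, node-level weight
    blended = {v: {u: alpha * bfs_view[v].get(u, 0.0)
                      + (1 - alpha) * dfs_view[v].get(u, 0.0)
                   for u in adj}
               for v in adj}
    print(blended[0])
```

In the full framework these blended structural profiles would feed a GNN whose layers carry residual connections (the HRC mechanism) to mitigate over-smoothing; that part is omitted here because the abstract does not specify its exact form.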
Pages: 14