GR-GNN: Gated Recursion-Based Graph Neural Network Algorithm

Cited by: 3
Authors
Ge, Kao [1 ]
Zhao, Jian-Qiang [2 ]
Zhao, Yan-Yong [3 ]
Affiliations
[1] Chinese Acad Sci, Nanjing Inst Software Technol, Inst Software, Nanjing 211135, Peoples R China
[2] Xuzhou Univ Technol, Sch Math & Stat, Xuzhou 221018, Jiangsu, Peoples R China
[3] Nanjing Audit Univ, Dept Stat, Nanjing 211815, Peoples R China
Keywords
GR-GNN; graph neural network; biased random walks; GRU
DOI
10.3390/math10071171
CLC number
O1 [Mathematics];
Discipline codes
0701; 070101;
Abstract
In the era of the internet, artificial intelligence, and big data, unstructured, graph-structured data such as social networks, knowledge graphs, and molecular compounds have gradually entered a wide range of concrete business scenarios. An urgent problem in industry is how to perform feature extraction, transformation, and computation on graph-structured data in order to solve downstream tasks such as node classification and graph classification. This paper therefore proposes a gated recursion-based graph neural network (GR-GNN) algorithm for tasks such as deep node-dependency feature extraction and node classification on graph-structured data. A GRU neural network unit is used to complete the node classification task and thereby construct the GR-GNN model. To verify the accuracy, effectiveness, and superiority of the algorithm, it was evaluated on the open datasets Cora, Citeseer, and PubMed and compared against the classical graph neural network baselines GCN, GAT, and GraphSAGE. The experimental results show that, on the validation set, the accuracy and target loss of GR-GNN are better than or equal to those of the baseline algorithms; in terms of convergence speed, GR-GNN is comparable to GCN and faster than the other algorithms. These results indicate that the proposed GR-GNN algorithm offers high accuracy and computational efficiency, and has broad practical applicability.
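This record does not include the model details, so the following is only a minimal sketch of the idea described in the abstract: a GRU unit that updates each node's representation from features aggregated over its neighbourhood. The class name GatedGraphLayer, the mean aggregation, and all dimensions are hypothetical illustrations, not the authors' implementation, and the biased random-walk neighbourhood ordering mentioned in the keywords is not modelled here.

```python
# Minimal sketch (assumption): gated recursion over a graph, where a GRU cell
# treats aggregated neighbour messages as the "input" and the node's previous
# representation as the hidden state. Not the published GR-GNN architecture.
import torch
import torch.nn as nn

class GatedGraphLayer(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.msg = nn.Linear(hidden_dim, hidden_dim)   # message transform
        self.gru = nn.GRUCell(hidden_dim, hidden_dim)  # gated state update

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (num_nodes, hidden_dim) node states; adj: (num_nodes, num_nodes) adjacency
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        m = (adj @ self.msg(h)) / deg                  # mean-aggregated neighbour messages
        return self.gru(m, h)                          # GRU-gated update of node states

# Toy usage: one gated layer followed by a linear node classifier
num_nodes, hidden_dim, num_classes = 5, 16, 3
h = torch.randn(num_nodes, hidden_dim)
adj = (torch.rand(num_nodes, num_nodes) > 0.5).float()
logits = nn.Linear(hidden_dim, num_classes)(GatedGraphLayer(hidden_dim)(h, adj))
print(logits.shape)  # torch.Size([5, 3])
```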
Pages: 13