GraphSense: a self-aware dynamic graph learning network for graph data over the internet

Cited by: 0
Authors
Li, Zhi-Yuan [1 ,2 ,3 ]
Zhou, Ying-Yi [1 ]
He, En-Han [1 ]
Affiliations
[1] Jiangsu Univ, Sch Comp Sci & Commun Engn, Zhenjiang, Jiangsu, Peoples R China
[2] Jiangsu Ind Network Secur Technol Key Lab, Zhenjiang, Jiangsu, Peoples R China
[3] Jiangsu Prov Engn Res Ctr Ubiquitous Data Intellig, Zhenjiang, Jiangsu, Peoples R China
Keywords
Graph neural network; Dynamic graph; Graph representation learning; Network structure learning
DOI
10.1007/s10489-024-05882-4
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Dynamic graph data learning is an important data analysis technique. In the age of big data, the volume of data produced daily is immense, the data types are varied, the value density is low, and the data continues to accumulate over time; these characteristics make data processing more challenging. In particular, unstructured data, unlike structured data, has no fixed format, and its volume is large and variable, which poses a significant challenge to traditional data processing techniques. Researchers have therefore turned to graph neural network models to analyze unstructured data. However, real-world graph structures are dynamic and time-varying, and static graph neural networks cannot effectively learn graph node embeddings and network structures. To address these challenges, we propose a self-aware dynamic graph network structure learning model, called GraphSense. The model consists of two modules: a self-aware neighborhood aggregation algorithm and an RNN-based dynamic graph structure learning algorithm. Through the self-aware neighborhood aggregation algorithm, each node discovers more valuable neighbors in every training epoch; the algorithm employs gated recurrent units to dynamically aggregate the information of node neighbors and learn high-order information. Next, to capture the temporal properties of graph structures, the RNN-based dynamic graph structure learning algorithm replicates the time-evolution process of dynamic graphs. Finally, we evaluate GraphSense on four publicly available datasets with two tasks (edge classification and node classification). The experimental results show that, in terms of F1 score, the proposed GraphSense model outperforms the baseline models by 2.0% to 25.0% on the Elliptic dataset, 2.5% to 27.0% on Bitcoin-alpha, 3.0% to 31.0% on Bitcoin-otc, and 0.9% to 26.0% on Reddit.
The results suggest that our model is effective in learning from dynamic graph data.
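The GRU-based neighbor aggregation described in the abstract can be illustrated with a small sketch: treat a node's current feature vector as the GRU hidden state and fold in its neighbors' features one step at a time. This is a minimal, hand-rolled NumPy illustration under assumptions (random weights, arbitrary neighbor order, a single toy node), not the authors' GraphSense implementation; `GRUCell` and `aggregate_neighborhood` are hypothetical names for exposition.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell with random weights (illustrative only)."""
    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        # Each gate maps the concatenation [h, x] back to a dim-sized vector.
        self.Wz = rng.normal(0.0, 0.1, (dim, 2 * dim))  # update gate
        self.Wr = rng.normal(0.0, 0.1, (dim, 2 * dim))  # reset gate
        self.Wh = rng.normal(0.0, 0.1, (dim, 2 * dim))  # candidate state

    def step(self, h, x):
        hx = np.concatenate([h, x])
        z = sigmoid(self.Wz @ hx)                        # how much to update
        r = sigmoid(self.Wr @ hx)                        # how much history to keep
        h_tilde = np.tanh(self.Wh @ np.concatenate([r * h, x]))
        return (1.0 - z) * h + z * h_tilde

def aggregate_neighborhood(node_feat, neighbor_feats, cell):
    """Fold neighbor features into the node state sequentially with the GRU."""
    h = node_feat
    for x in neighbor_feats:
        h = cell.step(h, x)
    return h

# Toy example: one node with three neighbors in a 4-dimensional feature space.
dim = 4
cell = GRUCell(dim)
node = np.ones(dim)
neighbors = [np.full(dim, v) for v in (0.5, -0.2, 0.3)]
embedding = aggregate_neighborhood(node, neighbors, cell)
print(embedding.shape)  # (4,)
```

In a real model the weights would be trained and the "self-aware" component would select which neighbors enter the sequence; a practical implementation would likely use `torch.nn.GRUCell` rather than hand-written gates.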
Pages: 19