Time-Aware Graph Learning for Link Prediction on Temporal Networks

Times Cited: 0
Authors
Pan, Zhiqiang [1 ]
Chen, Honghui [1 ]
Chen, Wanyu [2 ]
Cai, Fei [1 ]
Liu, Xinwang [3 ]
Affiliations
[1] Natl Univ Def Technol, Sci & Technol Informat Syst Engn Lab, Changsha 410073, Peoples R China
[2] Natl Univ Def Technol, Coll Elect Countermeasures, Hefei 230037, Peoples R China
[3] Natl Univ Def Technol, Sch Comp, Changsha 410073, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Correlation; Representation learning; Contrastive learning; Learning systems; Encoding; Computational modeling; Recurrent neural networks; Predictive models; Data models; Training; Causal colliding; contrastive learning; edge dropping (ED); link prediction; recent sampling; temporal networks;
DOI
10.1109/TNNLS.2025.3545021
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Link prediction on temporal networks aims to predict future edges by modeling the dynamic evolution of the graph data. Previous methods that rely on node/edge attributes or on explicit distances over the graph structure are impractical, due to the frequent absence of attributes and the limitations of explicit distance estimation, respectively. Moreover, existing graph representation learning methods mostly rely on graph neural networks (GNNs), which cannot adequately capture the dynamic correlations between nodes, leading to inferior node embeddings. We therefore propose a time-aware graph (TAG) learning method for link prediction on temporal networks. We first conduct a theoretical causal analysis proving that node correlations must remain unchanged for temporal graph representation learning with GNNs. We then model recent dynamic node correlations by designing an edge-dropping (ED) module and adopting a recent neighbor sampling (RNS) strategy, thereby approximating the above condition. In addition, we preserve long-term stable node correlations by introducing additional self-supervision via contrastive learning. Comprehensive experiments on four public temporal network datasets, i.e., MathOverflow, StackOverflow, AskUbuntu, and SuperUser, demonstrate that TAG achieves state-of-the-art performance in terms of average precision (AP) and area under the ROC curve (AUC). Moreover, TAG ensures high computational efficiency by keeping the temporal graph lightweight, making it practical for real-world applications.
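The edge-dropping (ED) and recent neighbor sampling (RNS) ideas described in the abstract can be illustrated with a minimal sketch on a temporal edge list. Note this is an assumption-laden illustration, not the paper's implementation: the function names, the uniform drop probability, and the (source, target, timestamp) edge format are all hypothetical choices for demonstration.

```python
import random

def recent_neighbor_sampling(edges, node, t, k):
    """Return up to k most recently connected neighbors of `node`
    among edges with timestamp strictly before t (RNS sketch)."""
    history = [(ts, v) for u, v, ts in edges if u == node and ts < t]
    history.sort(key=lambda x: x[0], reverse=True)  # most recent first
    return [v for ts, v in history[:k]]

def edge_dropping(edges, drop_prob, rng=None):
    """Independently drop each edge with probability drop_prob (ED sketch)."""
    rng = rng or random.Random(0)
    return [e for e in edges if rng.random() >= drop_prob]

# Toy temporal edge list: (source, target, timestamp).
edges = [(0, 1, 1.0), (0, 2, 2.0), (0, 3, 3.0), (1, 2, 2.5)]
print(recent_neighbor_sampling(edges, 0, 3.5, 2))  # -> [3, 2]
```

In the paper's framing, restricting each node to its most recent neighbors and randomly thinning edges both keep the sampled temporal graph small, which is consistent with the computational-efficiency claim in the abstract.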
Pages: 10619-10634
Page count: 16