Temporal Network Embedding for Link Prediction via VAE Joint Attention Mechanism

Cited by: 49
Authors
Jiao, Pengfei [1 ]
Guo, Xuan [2 ]
Jing, Xin [2 ]
He, Dongxiao [2 ]
Wu, Huaming [3 ]
Pan, Shirui [4 ]
Gong, Maoguo [5 ]
Wang, Wenjun [2 ]
Affiliations
[1] Tianjin Univ, Ctr Biosafety Res & Strategy, Law Sch, Tianjin 300350, Peoples R China
[2] Tianjin Univ, Coll Intelligence & Comp, Tianjin 300350, Peoples R China
[3] Tianjin Univ, Ctr Appl Math, Tianjin 300072, Peoples R China
[4] Monash Univ, Fac Informat Technol, Dept Data Sci & AI, Clayton, Vic 3800, Australia
[5] Xidian Univ, Int Res Ctr Intelligent Percept & Computat, Key Lab Intelligent Percept & Image Understanding, Minist Educ, Xian 710071, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Network topology; Task analysis; Predictive models; Topology; Logic gates; Social networking (online); Matrix decomposition; Link prediction; self-attention mechanism; temporal network embedding (TNE); variational autoencoder (VAE); VISUALIZATION;
DOI
10.1109/TNNLS.2021.3084957
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Network representation learning, or embedding, aims to project a network into a low-dimensional space that can serve a variety of downstream network tasks. Temporal networks are an important class of networks whose topology changes over time. Compared with methods for static networks, temporal network embedding (TNE) methods face three challenges: 1) they cannot describe the temporal dependence across network snapshots; 2) their node embeddings in the latent space fail to reflect changes in the network topology; and 3) they cannot avoid substantial redundant computation through parameter inheritance over a series of snapshots. To overcome these problems, we propose TVAE, a novel TNE method built on the variational autoencoder (VAE) framework, which captures the evolution of temporal networks for link prediction. It not only generates low-dimensional embedding vectors for nodes but also preserves the dynamic nonlinear features of temporal networks. By combining a self-attention mechanism with recurrent neural networks, TVAE can update node representations while preserving the temporal dependence of the vectors over time. We use parameter inheritance to keep each new embedding close to the previous one, rather than relying on explicit regularization, which makes the method effective for large-scale networks. We evaluate our model and several baselines on synthetic data sets and real-world networks. The experimental results demonstrate that TVAE achieves superior performance and lower time cost compared with the baselines.
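For concreteness, the following is a minimal, hedged sketch of the kind of architecture the abstract describes: a VAE-style encoder in which each node's history of snapshot features passes through self-attention and a GRU before being mapped to a Gaussian latent embedding, with an inner-product decoder scoring links in the next snapshot. All layer sizes, the adjacency-row inputs, and the dot-product link decoder are illustrative assumptions made for this example, not the authors' implementation.

```python
# Illustrative sketch only (not the authors' TVAE code): a VAE-style temporal
# network embedding model combining self-attention over past snapshots with a
# GRU, plus an inner-product link decoder. Input features, layer sizes, and the
# decoder are assumptions made for this example.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TemporalVAEEncoder(nn.Module):
    """Encodes a sequence of per-snapshot node features into latent embeddings."""

    def __init__(self, in_dim, hid_dim, z_dim, n_heads=4):
        super().__init__()
        self.proj = nn.Linear(in_dim, hid_dim)            # per-snapshot feature projection
        self.attn = nn.MultiheadAttention(hid_dim, n_heads, batch_first=True)
        self.gru = nn.GRU(hid_dim, hid_dim, batch_first=True)
        self.mu = nn.Linear(hid_dim, z_dim)               # Gaussian mean head
        self.logvar = nn.Linear(hid_dim, z_dim)           # Gaussian log-variance head

    def forward(self, x_seq):
        # x_seq: (num_nodes, num_snapshots, in_dim), e.g. rows of adjacency matrices
        h = torch.relu(self.proj(x_seq))
        h, _ = self.attn(h, h, h)                         # self-attention across snapshots
        h, _ = self.gru(h)                                # recurrent update over time
        h_last = h[:, -1]                                 # state at the latest snapshot
        return self.mu(h_last), self.logvar(h_last)


def reparameterize(mu, logvar):
    std = torch.exp(0.5 * logvar)
    return mu + std * torch.randn_like(std)


def link_scores(z):
    # Inner-product decoder: edge probability for every node pair.
    return torch.sigmoid(z @ z.t())


def vae_link_loss(adj_next, z, mu, logvar):
    # Reconstruction of the next snapshot plus the standard KL regularizer.
    recon = F.binary_cross_entropy(link_scores(z), adj_next)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl


if __name__ == "__main__":
    num_nodes, num_snaps = 50, 5
    adj_seq = (torch.rand(num_snaps, num_nodes, num_nodes) < 0.1).float()
    x_seq = adj_seq.permute(1, 0, 2)                      # node-major view of the history
    enc = TemporalVAEEncoder(in_dim=num_nodes, hid_dim=64, z_dim=16)
    mu, logvar = enc(x_seq)
    z = reparameterize(mu, logvar)
    loss = vae_link_loss(adj_seq[-1], z, mu, logvar)
    print("embeddings:", z.shape, "loss:", float(loss))
```

In such a sketch, the parameter inheritance mentioned in the abstract would amount to initializing the encoder for snapshot t+1 with the weights learned at snapshot t instead of retraining from scratch, which is what avoids redundant computation across the snapshot series.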
Pages: 7400-7413
Number of pages: 14