Global and local information-aware relational graph convolutional network for temporal knowledge graph completion

Times Cited: 1
Authors
Wang, Shuo [1 ]
Chen, Shuxu [1 ]
Zhong, Zhaoqian [1 ]
Affiliations
[1] Dalian Univ, Key Lab Adv Design & Intelligent Comp, Minist Educ, Dalian 116622, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Temporal knowledge graph; Link prediction; Representation learning; Graph neural network;
DOI
10.1007/s10489-024-05987-w
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Temporal knowledge graph completion (TKGC) focuses on inferring missing facts from temporal knowledge graphs (TKGs) and has been widely studied. While previous models based on graph neural networks (GNNs) have shown noteworthy results, they tend to rely on complex modules to learn contextual representations, which require large numbers of parameters and heavy memory consumption. Additionally, existing TKGC approaches focus on static feature representations for entities and relations, which fail to capture the semantic information of contexts effectively. In this paper, we propose a global and local information-aware relational graph convolutional network (GLARGCN) model to address these issues. First, we design a sampler that captures significant neighbors by combining global historical event frequencies with local temporal relative displacements and requires no additional learnable parameters. We then employ a time-aware encoder to model timestamps, relations, and entities uniformly. We perform a graph convolution operation to learn a global graph representation. Finally, our method predicts missing entities using a scoring function. We evaluate the model on four benchmark datasets and one specific dataset with unseen timestamps. The experimental results demonstrate that our proposed GLARGCN model not only outperforms contemporary models but also shows robust performance in scenarios with unseen timestamps.
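The parameter-free sampler described in the abstract scores candidate neighbors by two signals: how frequently an event recurs in global history, and how close its timestamp is to the query time. The following is a minimal illustrative sketch of that idea, not the paper's exact formula; the linear mixing weight `alpha` and the min-max normalizations are assumptions introduced here for illustration.

```python
from collections import Counter

def sample_neighbors(events, query_time, k, alpha=0.5):
    """Select k significant neighbor facts without learnable parameters.

    events: list of (neighbor_entity, relation, timestamp) facts
            observed around the query entity.
    alpha:  hypothetical weight balancing the global-frequency signal
            against the local temporal-proximity signal.
    """
    # Global signal: frequency of each (neighbor, relation) event in history.
    freq = Counter((e, r) for e, r, t in events)
    max_freq = max(freq.values())
    # Local signal: temporal relative displacement from the query timestamp.
    max_disp = max(abs(query_time - t) for _, _, t in events) or 1

    def score(fact):
        e, r, t = fact
        global_score = freq[(e, r)] / max_freq            # frequent events score higher
        local_score = 1 - abs(query_time - t) / max_disp  # recent events score higher
        return alpha * global_score + (1 - alpha) * local_score

    # Keep the k highest-scoring facts as the sampled neighborhood.
    return sorted(events, key=score, reverse=True)[:k]
```

Because the scores are computed directly from event counts and timestamp gaps, the sampler adds no trainable parameters, consistent with the abstract's claim.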
Pages: 15