TBDRI: block decomposition based on relational interaction for temporal knowledge graph completion

Cited by: 12
Authors
Yu, Mei [1 ,2 ,3 ]
Guo, Jiujiang [1 ,2 ,3 ]
Yu, Jian [1 ,2 ,3 ]
Xu, Tianyi [1 ,2 ,3 ]
Zhao, Mankun [1 ,2 ,3 ]
Liu, Hongwei [4 ]
Li, Xuewei [1 ,2 ,3 ]
Yu, Ruiguo [1 ,2 ,3 ]
Affiliations
[1] Tianjin Univ, Coll Intelligence & Comp, Tianjin, Peoples R China
[2] Tianjin Key Lab Adv Networking TANKLab, Tianjin, Peoples R China
[3] Tianjin Key Lab Cognit Comp & Applicat, Tianjin, Peoples R China
[4] Tianjin Foreign Studies Univ, Foreign Language Literature & Culture Studies Ctr, Tianjin, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Temporal KGC; Inverse relation; Tensor decomposition;
DOI
10.1007/s10489-022-03601-5
CLC classification
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Knowledge graph completion (KGC) can be interpreted as the task of inferring missing real-world facts. Despite the importance and abundance of temporal knowledge graphs, most current research has focused on reasoning over static knowledge graphs, even though the data they are applied to usually evolves with time, such as friendship graphs in social networks. Developing temporal knowledge graph completion (temporal KGC) models is therefore an increasingly important topic, although it is difficult due to data non-stationarity and complex temporal dependencies. In this paper, we propose block decomposition based on relational interaction for temporal knowledge graph completion (TBDRI), a novel model based on block term decomposition (which can be seen as a special variant of CP decomposition and Tucker decomposition) of the binary tensor representation of knowledge graph quadruples. TBDRI considers that inverse relations are one of the most important types of relations and account for a substantial share of real-world facts. Although some existing models introduce inverse relations, learning them independently is not sufficient. TBDRI learns inverse relations in an enhanced way that strengthens the binding between forward and inverse relations. Furthermore, TBDRI is the first to use the core tensor to carry temporal information, binding timestamps more adequately. We prove that TBDRI is fully expressive and derive a bound on its entity, relation, and timestamp embedding dimensionality. We show that TBDRI outperforms most previous state-of-the-art models on four benchmark datasets for temporal knowledge graph completion.
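To make the scoring scheme described in the abstract concrete, the following is a minimal, hypothetical sketch of a block-term-decomposition scorer for quadruples (subject, relation, object, timestamp): entity and relation embeddings are split into blocks, each block contributes a small Tucker term, forward and inverse relations share one embedding table, and each block's core tensor is gated by a timestamp embedding. The class name, dimensions, and gating scheme are illustrative assumptions, not the exact TBDRI formulation.

```python
# Hypothetical sketch only: a block-term-decomposition scorer for (s, r, o, t) quadruples.
import torch
import torch.nn as nn


class BlockTermScorer(nn.Module):
    def __init__(self, n_ent, n_rel, n_time, d_e=64, d_r=16, n_blocks=4):
        super().__init__()
        self.n_blocks, self.be, self.br = n_blocks, d_e // n_blocks, d_r // n_blocks
        self.ent = nn.Embedding(n_ent, d_e)
        # one row per forward relation and one per its inverse, learned in the same table
        self.rel = nn.Embedding(2 * n_rel, d_r)
        # per-block core tensors of shape (br, be, be)
        self.core = nn.Parameter(torch.randn(n_blocks, self.br, self.be, self.be) * 0.1)
        # timestamp embedding used to gate every entry of every block's core tensor
        self.time = nn.Embedding(n_time, n_blocks * self.br * self.be * self.be)

    def forward(self, s, r, o, t):
        es = self.ent(s).view(-1, self.n_blocks, self.be)   # (batch, blocks, be)
        eo = self.ent(o).view(-1, self.n_blocks, self.be)
        er = self.rel(r).view(-1, self.n_blocks, self.br)   # (batch, blocks, br)
        gate = torch.sigmoid(self.time(t)).view(-1, self.n_blocks, self.br, self.be, self.be)
        core_t = self.core.unsqueeze(0) * gate               # time-aware core tensors
        # Tucker contraction per block, then sum the block scores
        scores = torch.einsum('nbkij,nbk,nbi,nbj->nb', core_t, er, es, eo)
        return scores.sum(dim=-1)


# usage: score one quadruple (all ids are placeholders)
model = BlockTermScorer(n_ent=100, n_rel=10, n_time=50)
print(model(torch.tensor([3]), torch.tensor([2]), torch.tensor([7]), torch.tensor([5])))
```

Under this sketch, an inverse fact would be scored by passing the inverse-relation index (for example, r + n_rel) through the same table, which is one simple way to keep forward and inverse relations bound to shared parameters.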
Pages: 5072-5084
Page count: 13