Improving static and temporal knowledge graph embedding using affine transformations of entities

Cited by: 1
Authors
Yang, Jinfa [1 ]
Ying, Xianghua [1 ]
Shi, Yongjie [1 ]
Wang, Ruibin [1 ]
Affiliations
[1] Peking Univ, Sch Intelligence Sci & Technol, Natl Key Lab Gen Artificial Intelligence, Beijing, Peoples R China
Source
JOURNAL OF WEB SEMANTICS | 2024, Vol. 82
Keywords
Knowledge graph; Static knowledge graph embedding; Temporal knowledge graph embedding; Link prediction
D O I
10.1016/j.websem.2024.100824
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Finding a suitable embedding for a knowledge graph (KG) remains a major challenge. By measuring the distance or plausibility of triples and quadruples in static and temporal knowledge graphs, many reliable knowledge graph embedding (KGE) models have been proposed. However, these classical models may not represent and infer various relation patterns well: for example, TransE cannot represent symmetric relations, DistMult cannot represent inverse relations, and RotatE cannot represent multiple relations. In this paper, we improve the ability of these models to represent various relation patterns by introducing an affine transformation framework. Specifically, we first apply a set of affine transformations associated with each relation or timestamp to the entity vectors; the transformed vectors can then be used not only in static KGE models but also in temporal KGE models. The main advantage of affine transformations is their good geometric properties and interpretability. Our experimental results demonstrate that the proposed intuitive design with affine transformations provides a statistically significant increase in performance while adding only a few extra processing steps and keeping the number of embedding parameters unchanged. Taking TransE as an example, we employ the scale transformation (a special case of an affine transformation); surprisingly, it even outperforms RotatE to some extent on various datasets. We also introduce affine transformations into RotatE, DistMult, ComplEx, TTransE and TComplEx, respectively, and experiments demonstrate that affine transformations consistently and significantly improve the performance of state-of-the-art KGE models on both static and temporal knowledge graph benchmarks.
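The TransE example in the abstract can be sketched in a few lines: a relation-specific per-dimension scale (a special case of an affine map, as the abstract describes) is applied to the entity vectors before the usual translation distance is computed. This is a minimal illustration, not the paper's implementation; all vector names, the dimension, and the L1 distance choice are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50  # embedding dimension (illustrative choice)

# Hypothetical embeddings for one triple (h, r, t).
h = rng.normal(size=dim)   # head entity vector
t = rng.normal(size=dim)   # tail entity vector
r = rng.normal(size=dim)   # relation translation vector

# Relation-specific scale transformation: each relation carries a
# per-dimension scaling applied to entities before scoring (assumed form).
s_r = rng.uniform(0.5, 1.5, size=dim)


def transe_score(h, r, t):
    """Plain TransE score: negative L1 distance -||h + r - t||_1."""
    return -np.sum(np.abs(h + r - t))


def scale_transe_score(h, r, t, s_r):
    """TransE with a scale transformation on entities (sketch):
    score = -||s_r * h + r - s_r * t||_1, same parameter count per
    entity as plain TransE, plus one scale vector per relation."""
    return -np.sum(np.abs(s_r * h + r - s_r * t))


print(transe_score(h, r, t))
print(scale_transe_score(h, r, t, s_r))
```

Because the transformation acts only on entity vectors, the same trick can be dropped in front of other score functions (RotatE, DistMult, ComplEx) without changing their entity or relation embedding sizes.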
Pages: 11