μKG: A Library for Multi-source Knowledge Graph Embeddings and Applications

Cited: 3
Authors
Luo, Xindi [1]
Sun, Zequn [1]
Hu, Wei [1, 2]
Affiliations
[1] Nanjing Univ, State Key Lab Novel Software Technol, Nanjing, Peoples R China
[2] Nanjing Univ, Natl Inst Healthcare Data Sci, Nanjing, Peoples R China
Source
SEMANTIC WEB - ISWC 2022 | 2022, Vol. 13489
Funding
National Natural Science Foundation of China
Keywords
Multi-source knowledge graphs; Representation learning; Link prediction; Entity alignment; Entity typing;
DOI
10.1007/978-3-031-19433-7_35
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This paper presents μKG, an open-source Python library for representation learning over knowledge graphs. μKG supports joint representation learning over multi-source knowledge graphs (as well as a single knowledge graph), multiple deep learning libraries (PyTorch and TensorFlow 2), multiple embedding tasks (link prediction, entity alignment, entity typing, and multi-source link prediction), and multiple parallel computing modes (multi-process and multi-GPU computing). It currently implements 26 popular knowledge graph embedding models and supports 16 benchmark datasets. μKG provides advanced implementations of embedding techniques with simplified pipelines for different tasks. It also comes with high-quality documentation for ease of use. μKG is more comprehensive than existing knowledge graph embedding libraries, making it useful for thorough comparison and analysis of various embedding models and tasks. We show that the jointly learned embeddings can greatly help knowledge-powered downstream tasks, such as multi-hop knowledge graph question answering. We will stay abreast of the latest developments in the related fields and incorporate them into μKG.
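To illustrate what the link-prediction task mentioned in the abstract involves, here is a minimal sketch of a TransE-style scorer, one of the classic translational embedding models that libraries of this kind implement. This is not μKG's actual API; the matrices, function names, and dimensions below are purely illustrative.

```python
import numpy as np

# Illustrative (hypothetical) setup: small random entity/relation embeddings.
rng = np.random.default_rng(0)
num_entities, num_relations, dim = 5, 2, 8
E = rng.normal(size=(num_entities, dim))   # entity embeddings
R = rng.normal(size=(num_relations, dim))  # relation embeddings

def transe_score(h, r, t):
    """TransE plausibility of triple (h, r, t): ||h + r - t||, lower is better."""
    return np.linalg.norm(E[h] + R[r] - E[t])

def link_prediction(h, r):
    """Answer the query (h, r, ?): rank all entities as candidate tails."""
    scores = np.linalg.norm(E[h] + R[r] - E, axis=1)
    return np.argsort(scores)  # entity indices, best candidate first

ranking = link_prediction(0, 1)
```

A trained model would learn `E` and `R` by minimizing such scores for observed triples (and maximizing them for corrupted ones); evaluation then reports the rank of the true tail entity in `ranking`.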
Pages: 610-627 (18 pages)