TDN: An Integrated Representation Learning Model of Knowledge Graphs

Cited by: 4
Authors
Kang, Xiaojun [1 ,2 ]
Yao, Hong [1 ,2 ]
Li, Qingtao [1 ]
Li, Xinchuan [1 ,2 ]
Liu, Chao [1 ,2 ]
Dong, Lijun [1 ,2 ]
Affiliations
[1] China Univ Geosci, Sch Comp Sci, Wuhan 430074, Hubei, Peoples R China
[2] China Univ Geosci, Hubei Key Lab Intelligent Geoinformat Proc, Wuhan 430074, Hubei, Peoples R China
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China;
Keywords
Artificial intelligence; embedding; knowledge graph; knowledge representation;
DOI
10.1109/ACCESS.2019.2913086
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Knowledge graphs (KGs) play an important role in many artificial intelligence applications. Representation learning of KGs aims to project both entities and relations into a continuous low-dimensional space. Embedding-based representation learning has been used for KG completion, which aims to predict potential triples (head, relation, tail) in a KG. Most current methods learn representations from triple information alone, ignoring the textual knowledge and network topology of the KG, which leads to ambiguous completions. To address this problem and achieve more accurate KG completion, we propose a new representation learning model, the TDN model, which jointly embeds the triple information, text descriptions, and network structure of a KG in a low-dimensional vector space. The framework of TDN is defined and a methodology for implementing TDN embedding is explored. To verify the effectiveness of the proposed model, we evaluate TDN via link prediction experiments on real-world datasets. The experimental results confirm the above claims and show that TDN-based embedding significantly outperforms the baselines.
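The abstract describes TDN as jointly embedding triple information, text descriptions, and network structure, evaluated via link prediction. The sketch below is a minimal, hypothetical illustration of that integration idea, not the authors' actual TDN formulation: the weighted-sum fusion, the TransE-style distance score, and all names (combine_entity_embedding, score_triple, the toy entities) are assumptions introduced purely for illustration.

```python
# Minimal illustrative sketch (NOT the paper's actual TDN implementation):
# fuse three entity-embedding sources -- structural (triple-based),
# description-based, and topology-based -- into one vector per entity,
# then score a (head, relation, tail) triple TransE-style.
import numpy as np

DIM = 8                          # toy embedding dimension
rng = np.random.default_rng(0)

# Toy lookup tables; in a real system these would be learned jointly.
entities = ["Wuhan", "China"]
structure_emb = {e: rng.normal(size=DIM) for e in entities}
text_emb      = {e: rng.normal(size=DIM) for e in entities}
topology_emb  = {e: rng.normal(size=DIM) for e in entities}
relation_emb  = {"located_in": rng.normal(size=DIM)}


def combine_entity_embedding(entity: str) -> np.ndarray:
    """Fuse the three information sources into a single entity vector.
    A simple weighted sum is used here purely for illustration; the paper's
    fusion scheme may differ."""
    return (0.5 * structure_emb[entity]
            + 0.3 * text_emb[entity]
            + 0.2 * topology_emb[entity])


def score_triple(head: str, relation: str, tail: str) -> float:
    """TransE-style plausibility score: smaller ||h + r - t|| is more plausible."""
    h = combine_entity_embedding(head)
    t = combine_entity_embedding(tail)
    r = relation_emb[relation]
    return float(np.linalg.norm(h + r - t, ord=2))


if __name__ == "__main__":
    # Link prediction then reduces to ranking candidate tails by this score.
    print("score(Wuhan, located_in, China) =",
          score_triple("Wuhan", "located_in", "China"))
```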
Pages: 55199-55205
Page count: 7