KTAT: A Complex Embedding Model of Knowledge Graph Integrating Type Information and Attention Mechanism

Cited by: 0
Authors
Liu, Ying [1 ,2 ]
Wang, Peng [1 ,3 ]
Yang, Di [1 ]
Affiliations
[1] Changchun Univ Sci & Technol, Sch Comp Sci & Technol, Changchun 130022, Peoples R China
[2] Tonghua Normal Univ, Sch Comp Sci, Tonghua 134002, Peoples R China
[3] Changchun Univ Sci & Technol, Chongqing Res Inst, Chongqing 401120, Peoples R China
Source
APPLIED SCIENCES-BASEL, 2023, Vol. 13, Iss. 13
Keywords
artificial intelligence; deep learning; knowledge graph embedding; graph attention mechanism; link prediction;
DOI
10.3390/app13137924
CLC classification number
O6 [Chemistry];
Subject classification code
0703;
Abstract
Knowledge graph embedding learning aims to represent the entities and relations of real-world knowledge as low-dimensional dense vectors. Most existing knowledge representation learning methods aggregate only the internal information of triplets and the graph structure. Recent research has shown that multi-source information about entities leads to more accurate knowledge embeddings. In this paper, we propose KTAT, a model that integrates entity type information with an attention mechanism. Building on the graph attention mechanism, the model assigns attention according to the different weights between nodes. We introduce type-specific hyperplanes, which allow an entity to take a different embedding representation depending on its type in the current triplet. We also use textual descriptions of entities to further improve performance. Link prediction experiments on the FB15k and FB15k-237 datasets show that our model outperforms a range of baseline models and demonstrate that incorporating type information effectively improves link prediction performance.
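The two core ideas in the abstract, projecting an entity onto a type-specific hyperplane and weighting neighbors with attention, can be sketched as follows. This is a minimal illustration assuming a TransH-style hyperplane projection; the function names, dimensions, and the example type "person" are hypothetical and not taken from the paper:

```python
import numpy as np

def project_to_type_hyperplane(e, w_t):
    """TransH-style projection: map entity embedding e onto the hyperplane
    with normal vector w_t, giving a type-dependent representation."""
    w = w_t / np.linalg.norm(w_t)          # unit normal of the hyperplane
    return e - np.dot(w, e) * w            # remove the component along w

def attention_weights(scores):
    """Softmax over unnormalized neighbor scores (graph-attention weighting)."""
    s = np.exp(scores - np.max(scores))    # subtract max for numerical stability
    return s / s.sum()

rng = np.random.default_rng(0)
e = rng.normal(size=8)                     # an entity embedding
w_person = rng.normal(size=8)              # hypothetical normal for type "person"
e_proj = project_to_type_hyperplane(e, w_person)

# The projection lies on the hyperplane, i.e. it is orthogonal to the normal,
# so the same entity gets a different representation under each type.
print(abs(np.dot(w_person / np.linalg.norm(w_person), e_proj)) < 1e-9)
print(np.isclose(attention_weights(np.array([1.0, 2.0, 3.0])).sum(), 1.0))
```

Projecting with a per-type normal vector is what lets one entity carry different representations across triplets; the softmax then distributes attention so that higher-scoring neighbors contribute more to the aggregated embedding.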
Pages: 13
Related papers (50 items)
  • [1] A knowledge graph embedding model based attention mechanism for enhanced node information integration
    Liu, Ying
    Wang, Peng
    Yang, Di
    Qiu, Ningjia
    PEERJ COMPUTER SCIENCE, 2024, 10
  • [2] A deep embedding model for knowledge graph completion based on attention mechanism
    Huang, Jin
    Zhang, TingHua
    Zhu, Jia
    Yu, Weihao
    Tang, Yong
    He, Yang
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (15): 9751-9760
  • [4] BHRAM: a knowledge graph embedding model based on bidirectional and heterogeneous relational attention mechanism
    Zhang, Chaoqun
    Li, Wanqiu
    Mo, Yuanbin
    Tang, Weidong
    Li, Haoran
    Zeng, Zhilin
    APPLIED INTELLIGENCE, 2025, 55 (03)
  • [5] Knowledge Graph Embedding Combining with Hierarchical Type Information
    Zhang J.-D.
    Li J.
    Ruan Jian Xue Bao/Journal of Software, 2022, 33 (09)
  • [6] Item Recommendation Algorithm Integrating Knowledge Graph and Attention Mechanism
    Xing, Junye
    Xing, Xing
    Jia, Zhichun
    Wang, Hongda
    Liu, Jiawen
    Computer Engineering and Applications, 60 (10): 173-179
  • [7] Embedding dynamic graph attention mechanism into Clinical Knowledge Graph for enhanced diagnostic accuracy
    Chen, Deng
    Zhang, Weiwei
    Ding, Zuohua
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 267
  • [8] Knowledge Graph Embedding Model Based on k-Order Sampling and Graph Attention Networks
    Liu, Wenjie
    Yao, Junfei
    Chen, Liang
    Computer Engineering and Applications, 2024, 60 (02): 113-120
  • [9] A contrastive knowledge graph embedding model with hierarchical attention and dynamic completion
    Shang, Bin
    Zhao, Yinliang
    Liu, Jun
    Liu, Yifan
    Wang, Chenxin
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (20): 15005-15018
  • [10] Attention-Based Direct Interaction Model for Knowledge Graph Embedding
    Zhou, Bo
    Chen, Yubo
    Liu, Kang
    Zhao, Jun
    SEMANTIC TECHNOLOGY, JIST 2019, 2020, 1157 : 100 - 108