Graph Attention Mechanism with Cardinality Preservation for Knowledge Graph Completion

Cited by: 1
Authors
Ding, Cong [1 ]
Wei, Xiao [1 ]
Chen, Yongqi [1 ]
Zhao, Rui [1 ]
Affiliations
[1] Shanghai Univ, Sch Comp Engn & Sci, Shanghai, Peoples R China
Source
KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT I, 2021, Vol. 12815
Keywords
Cardinality preservation; Knowledge graph; Graph attention mechanism;
DOI
10.1007/978-3-030-82136-4_39
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Embedding knowledge graphs with graph attention networks has become a novel research topic in the field of knowledge graph completion. However, when generating entity embeddings for a knowledge graph, the current graph attention network produces the same embedding for different structures and different entities. The quality of these embeddings directly determines the effectiveness of completion. We analyze why the graph attention network cannot distinguish such structures: attention-based aggregation in GNNs ignores cardinality information, i.e., the mapping of distinct features, which helps distinguish the contributions of different nodes in a neighborhood. Therefore, we propose a cardinality-preserving graph attention model (KBCPA). Cardinality information is added into the attention-based aggregation so that different entities receive different representations, which improves the discriminative ability of the model. Our experiments show that the model is effective and competitive, obtaining better performance than previous state-of-the-art embedding models for knowledge graph completion on the two benchmark datasets WN18RR and FB15k-237.
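To make the aggregation idea above concrete, the following is a minimal, illustrative NumPy sketch (not the authors' released implementation) of one cardinality-preserving attention aggregation step. The function names, parameter shapes, and the specific additive form of the cardinality term are assumptions made for the example; the point is only that an attention-weighted average alone is largely insensitive to neighborhood size, while an element-wise scaled, unweighted sum over neighbors re-introduces that cardinality information.

    # Illustrative sketch (assumed form, not the paper's code): one aggregation
    # step of a cardinality-preserving graph attention layer.
    import numpy as np

    def leaky_relu(x, slope=0.2):
        return np.where(x > 0, x, slope * x)

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def cardinality_preserving_attention(h_i, neighbors, a, W, w_card):
        """Aggregate neighbor embeddings for one entity.

        h_i       : (d,)   embedding of the centre entity
        neighbors : (n, d) embeddings of its n neighbors
        a         : (2d,)  attention vector (hypothetical parameter)
        W         : (d, d) shared linear transform (hypothetical parameter)
        w_card    : (d,)   element-wise weights of the cardinality term (hypothetical)
        """
        d = h_i.shape[0]
        Wh_i = W @ h_i                              # (d,)
        Wh_n = neighbors @ W.T                      # (n, d)

        # GAT-style attention scores over the neighborhood.
        scores = leaky_relu(Wh_n @ a[d:] + Wh_i @ a[:d])
        alpha = softmax(scores)                     # (n,) attention weights

        # Attention-weighted average: by itself, insensitive to neighborhood size.
        attended = alpha @ Wh_n                     # (d,)

        # Cardinality term: an unweighted, element-wise scaled sum whose magnitude
        # grows with the number of neighbors, so neighborhoods of different sizes
        # can no longer collapse to the same embedding.
        cardinality = (w_card * Wh_n).sum(axis=0)   # (d,)

        return np.tanh(attended + cardinality)

    # Toy usage: two neighborhoods with identical (duplicated) features but
    # different sizes yield different aggregated embeddings.
    rng = np.random.default_rng(0)
    d = 4
    W = rng.normal(size=(d, d))
    a = rng.normal(size=2 * d)
    w_card = rng.normal(size=d)
    h_i = rng.normal(size=d)
    nb = rng.normal(size=(3, d))
    print(cardinality_preserving_attention(h_i, nb, a, W, w_card))
    print(cardinality_preserving_attention(h_i, np.vstack([nb, nb]), a, W, w_card))

In the toy usage, the duplicated neighborhood produces the same attention-weighted average as the original one, but the cardinality term doubles, so the two aggregated embeddings differ.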
Pages: 479-490
Page count: 12
Related Papers
50 records in total
  • [1] Improving Attention Mechanism in Graph Neural Networks via Cardinality Preservation
    Zhang, Shuo
    Xie, Lei
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020: 1395-1402
  • [2] GAFM: A Knowledge Graph Completion Method Based on Graph Attention Faded Mechanism
    Ma, Jiangtao
    Li, Duanyang
    Zhu, Haodong
    Li, Chenliang
    Zhang, Qiuwen
    Qiao, Yaqiong
    INFORMATION PROCESSING & MANAGEMENT, 2022, 59 (05)
  • [3] Translating Embeddings for Knowledge Graph Completion with Relation Attention Mechanism
    Qian, Wei
    Fu, Cong
    Zhu, Yu
    Cai, Deng
    He, Xiaofei
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018: 4286-4292
  • [4] Hyperbolic hierarchical graph attention network for knowledge graph completion
    Xu, Hao
    Chen, Shudong
    Qi, Donglin
    Tong, Da
    Yu, Yong
    Chen, Shuai
    High Technology Letters, 2024, 30 (03): 271-279
  • [5] Hierarchical Perceptual Graph Attention Network for Knowledge Graph Completion
    Han, Wenhao
    Liu, Xuemei
    Zhang, Jianhao
    Li, Hairui
    ELECTRONICS, 2024, 13 (04)
  • [6] Knowledge graph completion based on graph contrastive attention network
    Liu D.
    Fang Q.
    Zhang X.
    Hu J.
    Qian S.
    Xu C.
    Beijing Hangkong Hangtian Daxue Xuebao/Journal of Beijing University of Aeronautics and Astronautics, 2022, 48 (08): 1428-1435
  • [7] Heterogeneous Graph Neural Network Knowledge Graph Completion Model Based on Improved Attention Mechanism
    Shi, Junkang
    Li, Ming
    Zhao, Jing
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT IV, 2023, 14257: 423-434
  • [8] A deep embedding model for knowledge graph completion based on attention mechanism
    Huang, Jin
    Zhang, TingHua
    Zhu, Jia
    Yu, Weihao
    Tang, Yong
    He, Yang
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (15): 9751-9760