Simple Knowledge Graph Completion Model Based on Differential Negative Sampling and Prompt Learning

Cited: 0
Authors
Duan, Li [1 ]
Wang, Jing [1 ]
Luo, Bing [1 ]
Sun, Qiao [1 ,2 ]
Affiliations
[1] Naval Univ Engn, Coll Elect Engn, Wuhan 430033, Peoples R China
[2] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
Keywords
natural language processing; knowledge graph completion; prompt learning; positive unlabeled learning
DOI
10.3390/info14080450
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Knowledge graphs (KGs) serve as a crucial resource for numerous artificial intelligence tasks and have contributed significantly to advances in the field. However, the incompleteness of existing KGs limits their effectiveness in practical applications, which has motivated the task of KG completion. Embedding-based techniques currently dominate the field, as they leverage the structural information within KGs to infer missing parts. Nonetheless, these methods are limited by the quality and quantity of available structural information and cannot handle entities absent from the original KG. To overcome these challenges, researchers have integrated pretrained language models and textual data, using the definition statements and descriptive text of entities within KGs to capture latent connections that are difficult for traditional methods to obtain. However, text-based methods still lag behind embedding-based models in performance. Our analysis reveals that the critical issue lies in the selection of negative samples. To enhance the performance of text-based methods, this study employs several types of negative sampling. We introduce prompt learning to bridge the gap between the pretrained language model and the KG completion task and to improve the model's reasoning ability. We also propose a ranking strategy based on KG structural information, so that the KG's structured data can assist inference. The experimental results demonstrate that our model is strongly competitive and offers outstanding inference speed. By fully exploiting the internal structural information of KGs and external descriptive text resources, we improve KG completion performance across various metrics.
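The two ingredients the abstract combines, prompt learning and negative sampling, can be illustrated with a minimal, self-contained sketch. All names, the toy triples, and the cloze template below are illustrative assumptions, not the paper's actual method: a triple is verbalized into a masked-language-model prompt, and a negative sample is produced by corrupting the tail entity (here with uniform corruption; the paper's "differential" scheme would instead weight candidates):

```python
import random

# Hypothetical toy KG: (head, relation, tail) triples. Purely illustrative.
TRIPLES = [
    ("Paris", "capital_of", "France"),
    ("Berlin", "capital_of", "Germany"),
    ("Tokyo", "capital_of", "Japan"),
]
ENTITIES = sorted({e for h, _, t in TRIPLES for e in (h, t)})

def verbalize(head, relation, tail):
    """Turn a triple into a cloze-style prompt for a masked language model."""
    templates = {"capital_of": "{h} is the capital of {t}. This statement is [MASK]."}
    return templates[relation].format(h=head, t=tail)

def corrupt_tail(triple, entities, rng):
    """Negative sampling: replace the tail with a different random entity.
    A differential scheme would score candidates by difficulty; this sketch
    uses only uniform corruption."""
    head, relation, tail = triple
    candidates = [e for e in entities if e != tail]
    return (head, relation, rng.choice(candidates))

rng = random.Random(0)
pos = TRIPLES[0]
neg = corrupt_tail(pos, ENTITIES, rng)
print(verbalize(*pos))  # prompt for the positive triple
print(verbalize(*neg))  # prompt for a corrupted (negative) triple
```

In a real pipeline, the two prompts would be scored by a pretrained language model (e.g. the probability it assigns to "true"/"false" at the [MASK] position), and the paper's structure-based ranking strategy would then re-rank candidate tails.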
Pages: 15
Related Papers (50 total)
  • [31] Shi, Junkang; Li, Ming; Zhao, Jing. Heterogeneous Graph Neural Network Knowledge Graph Completion Model Based on Improved Attention Mechanism. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT IV, 2023, 14257: 423-434
  • [32] Liu, D.; Fang, Q.; Zhang, X.; Hu, J.; Qian, S.; Xu, C. Knowledge graph completion based on graph contrastive attention network. Beijing Hangkong Hangtian Daxue Xuebao/Journal of Beijing University of Aeronautics and Astronautics, 2022, 48 (08): 1428-1435
  • [33] Wang, Yashen; Ouyang, Xiaoye; Zhu, Xiaoling; Guo, Dayu; Zhang, Yi. An Aggregation Procedure Enhanced Mechanism for GCN-Based Knowledge Graph Completion Model by Leveraging Condensed Sampling and Attention Optimization. WEB AND BIG DATA, APWEB-WAIM 2024, PT II, 2024, 14962: 341-356
  • [34] Lu, Xinyu; Wang, Lifang; Jiang, Zejun; Liu, Shizhong; Lin, Jiashi. MRE: A translational knowledge graph completion model based on multiple relation embedding. MATHEMATICAL BIOSCIENCES AND ENGINEERING, 2023, 20 (03): 5881-5900
  • [35] Li, Jun; Hou, Jie; Zhou, Chunyu. An Improved Capsule Network-based Embedding Model for Knowledge Graph Completion. PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021), 2021: 2247-2251
  • [36] Ding, Ruidong; Zhou, Bin. Enhancing Domain-Specific Knowledge Graph Reasoning via Metapath-Based Large Model Prompt Learning. ELECTRONICS, 2025, 14 (05)
  • [37] Wang, Yuzhuo; Wang, Hongzhi; Liu, Xianglong; Yan, Yu. GFedKG: GNN-based federated embedding model for knowledge graph completion. KNOWLEDGE-BASED SYSTEMS, 2024, 301
  • [38] Guo, Lingbing; Zhang, Qingheng; Ge, Weiyi; Hu, Wei; Qu, Yuzhong. DSKG: A Deep Sequential Model for Knowledge Graph Completion. KNOWLEDGE GRAPH AND SEMANTIC COMPUTING: KNOWLEDGE COMPUTING AND LANGUAGE UNDERSTANDING (CCKS 2018), 2019, 957: 65-77
  • [39] Li, Weidong; Zhang, Xinyu; Wang, Yaqian; Yan, Zhihuan; Peng, Rong. Graph2Seq: Fusion Embedding Learning for Knowledge Graph Completion. IEEE ACCESS, 2019, 7: 157960-157971
  • [40] Yin, Hong; Zhong, Jiang; Li, Rongzhen; Li, Xue. Disentangled Relational Graph Neural Network with Contrastive Learning for knowledge graph completion. KNOWLEDGE-BASED SYSTEMS, 2024, 295