Simple Knowledge Graph Completion Model Based on Differential Negative Sampling and Prompt Learning

Citations: 0
Authors
Duan, Li [1 ]
Wang, Jing [1 ]
Luo, Bing [1 ]
Sun, Qiao [1 ,2 ]
Affiliations
[1] Naval Univ Engn, Coll Elect Engn, Wuhan 430033, Peoples R China
[2] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
Keywords
natural language processing; knowledge graph completion; prompt learning; positive unlabeled learning
DOI
10.3390/info14080450
Chinese Library Classification
TP [automation technology; computer technology]
Discipline code
0812
Abstract
Knowledge graphs (KGs) serve as a crucial resource for numerous artificial intelligence tasks and have contributed significantly to the advancement of the field. However, the incompleteness of existing KGs limits their effectiveness in practical applications, which has motivated the task of KG completion. Embedding-based techniques currently dominate the field, as they exploit the structural information within KGs to infer missing facts. Nonetheless, these methods are limited by the quality and quantity of the available structural information and cannot handle entities absent from the original KG. To overcome these challenges, researchers have integrated pretrained language models and textual data, using the definition statements and description text of entities within KGs to capture latent connections that structural methods struggle to obtain. However, text-based methods still lag behind embedding-based models in performance. Our analysis reveals that the critical issue lies in the selection of negative samples. To enhance the performance of text-based methods, this study employs several differentiated negative sampling strategies. We introduce prompt learning to bridge the gap between the pretrained language model and the KG completion task and to improve the model's reasoning ability. In addition, a ranking strategy based on KG structural information is proposed so that structured KG data can assist inference. Experimental results demonstrate that our model is strongly competitive and offers outstanding inference speed. By fully exploiting the internal structural information of KGs and external descriptive text resources, we improve the performance of KG completion across various metrics.
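The abstract describes two ideas only at a high level: rendering triples as cloze-style prompts for a pretrained language model, and sampling negatives while guarding against false negatives (the positive-unlabeled concern named in the keywords). As a rough illustration under stated assumptions (the paper's actual prompt templates, datasets, and sampler are not given in this record, so every name, template, and triple below is hypothetical), a minimal sketch might look like this:

```python
import random

# Hypothetical toy KG of (head, relation, tail) triples; purely illustrative.
TRIPLES = {
    ("Paris", "capital_of", "France"),
    ("Berlin", "capital_of", "Germany"),
    ("Rome", "capital_of", "Italy"),
}
ENTITIES = sorted({e for h, _, t in TRIPLES for e in (h, t)})


def make_prompt(head, relation, tail):
    """Render a triple as a cloze-style prompt for a masked language model.

    An assumed template; the paper's real prompts may differ substantially.
    """
    template = "{h} is the {r} of [SEP] ? {t} . Is this true ? [MASK]"
    return template.format(h=head, r=relation.replace("_", " "), t=tail)


def sample_negatives(head, relation, k, rng=None):
    """Corrupt the tail entity to create negatives, filtering out known
    positives so that observed triples are never mislabeled as negative
    (a basic positive-unlabeled precaution)."""
    rng = rng or random.Random(0)
    candidates = [
        e for e in ENTITIES
        if (head, relation, e) not in TRIPLES and e != head
    ]
    return [(head, relation, t) for t in rng.sample(candidates, min(k, len(candidates)))]
```

In this sketch, "differentiated" sampling could be realized by swapping in different `candidates` filters (e.g. restricting to entities of the same type), but the concrete strategies belong to the paper itself.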
Pages: 15