Simple Knowledge Graph Completion Model Based on Differential Negative Sampling and Prompt Learning

Times Cited: 0
Authors
Duan, Li [1 ]
Wang, Jing [1 ]
Luo, Bing [1 ]
Sun, Qiao [1 ,2 ]
Affiliations
[1] Naval Univ Engn, Coll Elect Engn, Wuhan 430033, Peoples R China
[2] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
Keywords
natural language processing; knowledge graph completion; prompt learning; positive-unlabeled learning
DOI
10.3390/info14080450
Chinese Library Classification
TP [automation technology; computer technology];
Discipline Code
0812;
Abstract
Knowledge graphs (KGs) serve as a crucial resource for numerous artificial intelligence tasks, significantly contributing to the advancement of the AI field. However, the incompleteness of existing KGs hinders their effectiveness in practical applications. Consequently, researchers have proposed the task of KG completion. Currently, embedding-based techniques dominate the field, as they leverage the structural information within KGs to infer and complete missing parts. Nonetheless, these methods exhibit limitations: they are constrained by the quality and quantity of structural information, and they cannot handle entities missing from the original KG. To overcome these challenges, researchers have attempted to integrate pretrained language models and textual data to perform KG completion. This approach utilizes the definition statements and description text of entities within KGs, aiming to recover latent connections that are difficult for traditional methods to obtain. However, text-based methods still lag behind embedding-based models in terms of performance. Our analysis reveals that the critical issue lies in the selection of negative samples. To enhance the performance of text-based methods, this study employs several differentiated negative sampling strategies. We introduce prompt learning to bridge the gap between the pretrained language model and the KG completion task, improving the model's reasoning ability. In addition, a ranking strategy based on KG structural information is proposed to assist inference. The experimental results demonstrate that our model exhibits strong competitiveness and outstanding inference speed. By fully exploiting the internal structural information of KGs and external descriptive text resources, we elevate the performance of KG completion across various metrics.
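To make the negative-sampling idea in the abstract concrete, the sketch below shows the standard "tail corruption with filtering" scheme used in KG completion: negatives are built by replacing the tail entity of a true triple, while corruptions that happen to be other known true triples are skipped. This is a minimal illustrative example, not the authors' implementation (which combines several differentiated sampling strategies); all function and variable names here are hypothetical.

```python
import random

def corrupt_negatives(triple, all_entities, true_triples, k=3, seed=0):
    """Generate up to k negative triples by replacing the tail entity.

    Corruptions that are themselves known true triples are skipped
    (the common 'filtered' setting), so every returned triple is a
    genuine negative with respect to the given KG.
    """
    rng = random.Random(seed)
    h, r, t = triple
    # Candidate replacement tails: every entity except the true tail.
    candidates = [e for e in all_entities if e != t]
    rng.shuffle(candidates)
    negatives = []
    for e in candidates:
        if (h, r, e) not in true_triples:  # filter out false negatives
            negatives.append((h, r, e))
        if len(negatives) == k:
            break
    return negatives

# Toy usage: corrupt the tail of ("Paris", "capital_of", "France").
entities = ["Paris", "Lyon", "Berlin", "France"]
kg = {("Paris", "capital_of", "France")}
negs = corrupt_negatives(("Paris", "capital_of", "France"), entities, kg, k=2)
```

A text-based completion model would then verbalize each positive and negative triple (e.g. via a prompt template) and train the language model to score positives above negatives.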
Pages: 15