EwLoss: An Exponential Weighted Loss Function for Knowledge Graph Embedding Models

Cited by: 0
Authors
Shen, Qiuhui [1 ]
Zhang, Hongjun [2 ]
Liao, Chunlin [1 ]
Affiliations
[1] Army Engn Univ PLA, Sch Grad, Nanjing 210007, Peoples R China
[2] Army Engn Univ PLA, Coll Command & Control Engn, Nanjing 210000, Peoples R China
Keywords
Difficult-to-distinguish samples; easy-to-distinguish samples; knowledge graph; knowledge graph embedding model; loss function; negative triplets; negative sampling
DOI
10.1109/ACCESS.2023.3322204
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Knowledge graphs have been widely used in various domains such as intelligent question answering, information retrieval, transportation, medicine, e-commerce, and others, and have achieved significant success. To support these applications with a high-quality vector representation, numerous studies have proposed various knowledge graph embedding (KGE) models. However, most of these models primarily focus on the design of new scoring functions, disregarding the crucial role of the loss function during the model training stage. In light of this, we propose a new exponential weight coefficient loss function, named EwLoss, which can be incorporated into any typical KGE model. This loss function enables the model to better distinguish between triplets that are otherwise difficult to differentiate. Consequently, the model can learn the features of these challenging triplets more effectively, leading to improved embedding vectors. By conducting experiments on four commonly used data sets, where EwLoss is applied to six representative models, we demonstrate the superiority of EwLoss in distinguishing such hard-to-distinguish triplets. Moreover, we observe that the performance of the embedding models is significantly enhanced when utilizing EwLoss.
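The record does not give the EwLoss formula itself, but the abstract's core idea, exponentially up-weighting hard-to-distinguish (high-scoring) negative triplets so the model focuses on them, can be sketched roughly as below. The function name, the margin-ranking form, and the `alpha`/`gamma` parameters are assumptions for illustration, not the paper's actual definition.

```python
import math

def ewloss(pos_score, neg_score, gamma=1.0, alpha=1.0):
    """Hypothetical sketch of an exponentially weighted ranking loss.

    pos_score / neg_score: scoring-function values for a positive triplet
    and one negative sample (higher = more plausible). A negative triplet
    with a high score is hard to distinguish from a true one, so an
    exponential weight amplifies its contribution to the loss.
    """
    # Standard margin ranking term: penalize when the negative scores
    # within `gamma` of the positive.
    margin_term = max(0.0, gamma - pos_score + neg_score)
    # Assumed exponential weight: grows with the negative's score,
    # emphasizing hard-to-distinguish negatives.
    weight = math.exp(alpha * neg_score)
    return weight * margin_term
```

Under this sketch, an easy negative that already scores far below the positive contributes zero loss, while two negatives with the same margin violation are weighted differently according to how plausible each looks to the model.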
Pages: 110670-110680
Page count: 11