EwLoss: An Exponential Weighted Loss Function for Knowledge Graph Embedding Models

Cited by: 0
Authors
Shen, Qiuhui [1 ]
Zhang, Hongjun [2 ]
Liao, Chunlin [1 ]
Affiliations
[1] Army Engn Univ PLA, Sch Grad, Nanjing 210007, Peoples R China
[2] Army Engn Univ PLA, Coll Command & Control Engn, Nanjing 210000, Peoples R China
Keywords
Difficult-to-distinguish samples; easy-to-distinguish samples; knowledge graph; knowledge graph embedding model; loss function; negative triplets; negative sampling
DOI
10.1109/ACCESS.2023.3322204
CLC classification number
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Knowledge graphs have been widely used in various domains such as intelligent question answering, information retrieval, transportation, medicine, e-commerce, and others, and have achieved significant success. To support these applications with a high-quality vector representation, numerous studies have proposed various knowledge graph embedding (KGE) models. However, most of these models primarily focus on the design of new scoring functions, disregarding the crucial role of the loss function during the model training stage. In light of this, we propose a new exponential weight coefficient loss function, named EwLoss, which can be incorporated into any typical KGE model. This loss function enables the model to better distinguish between triplets that are otherwise difficult to differentiate. Consequently, the model can learn the features of these challenging triplets more effectively, leading to improved embedding vectors. By conducting experiments on four commonly used data sets, where EwLoss is applied to six representative models, we demonstrate the superiority of EwLoss in distinguishing such hard-to-distinguish triplets. Moreover, we observe that the performance of the embedding models is significantly enhanced when utilizing EwLoss.
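The record does not reproduce the paper's formula, but the abstract's core idea — exponentially up-weighting triplets the model finds hard to distinguish so their gradients dominate training — can be sketched as follows. This is a minimal illustration under assumed conventions (higher score = more plausible triplet, a margin-based base loss, and a sharpness parameter `alpha`), not the authors' exact EwLoss definition:

```python
import math

def ewloss_sketch(pos_score, neg_scores, margin=1.0, alpha=1.0):
    """Hypothetical sketch of an exponentially weighted margin loss.

    Each negative triplet's margin violation is weighted by
    exp(alpha * violation), so negatives that score close to (or above)
    the positive triplet -- the hard-to-distinguish ones -- contribute
    exponentially more to the loss. alpha controls the sharpness.
    """
    # Margin violation per negative: zero for easy negatives,
    # large for negatives the model cannot separate from the positive.
    violations = [max(0.0, margin + n - pos_score) for n in neg_scores]
    # Exponential weights, normalized so they form a distribution.
    weights = [math.exp(alpha * v) for v in violations]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Weighted loss: hard negatives dominate the sum.
    return sum(w * v for w, v in zip(weights, violations))
```

With only easy negatives the loss vanishes, while a single hard negative pulls the weighted loss above the plain unweighted average, which is the behavior the abstract attributes to EwLoss.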
Pages: 110670-110680 (11 pages)