SFP: temporal knowledge graph completion based on sequence-focus patterns representation learning

Cited: 0
Authors
Wang, Jingbin [1 ]
Ke, Xifan [1 ]
Zhang, Fuyuan [1 ]
Wu, Yuwei [1 ]
Zhang, Sirui [1 ]
Guo, Kun [1 ]
Affiliations
[1] Fuzhou Univ, Coll Comp & Data Sci, Fuzhou 350108, Fujian, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Knowledge graph completion; Graph Attention Network; Link prediction; Temporal knowledge graphs;
DOI
10.1007/s10489-025-06306-7
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The extrapolation task on temporal knowledge graphs has received increasing attention from scholars because of its wide range of practical applications. Recurrent neural networks are currently the mainstream tool in temporal knowledge graph completion, where they are used to model the sequential pattern of entities and relations. However, as the sequence lengthens, critical early information may become diluted, which leads to prediction errors in the completion task. Moreover, existing temporal knowledge graph completion methods fail to account for the topological structure of relations, so the relation representations they produce show little distinction across different timestamps. To address these issues, we introduce a temporal knowledge graph completion method based on Sequence-Focus Patterns representation learning (SFP). The method combines two patterns: a Focus pattern and a Sequential pattern. Within SFP, we develop ConvGAT, a novel graph attention network that efficiently distinguishes and extracts complex relation information, improving the accuracy of the entity representations aggregated in both the Focus and Sequential patterns. We further propose RelGAT, a graph attention network that models the topological structure of relations; it sharpens relation representations and helps distinguish the relation embeddings generated at different timestamps in the Focus pattern. Using a time-aware attention mechanism, the Focus pattern extracts vital information at particular timestamps to compensate for the information that the Sequential pattern dilutes. Comprehensive experiments on five benchmark datasets show that SFP significantly outperforms the baselines.
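The record contains no code, but a minimal sketch may help make the Focus pattern concrete. The PyTorch module below shows a generic time-aware attention readout over per-timestamp entity embeddings: a query-time entity state attends over snapshot embeddings concatenated with timestamp encodings, and the attention weights pool the snapshots into one focused vector. All names, shapes, and the module itself are illustrative assumptions, not the authors' SFP, ConvGAT, or RelGAT implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TimeAwareAttention(nn.Module):
    """Hypothetical time-aware attention readout (not the paper's code)."""

    def __init__(self, d_model: int):
        super().__init__()
        self.query_proj = nn.Linear(d_model, d_model)
        # Keys are built from the snapshot embedding plus its timestamp encoding.
        self.key_proj = nn.Linear(2 * d_model, d_model)
        self.scale = d_model ** 0.5

    def forward(self,
                query: torch.Tensor,      # (batch, d_model) entity state at the query time
                snapshots: torch.Tensor,  # (batch, T, d_model) per-timestamp entity embeddings
                time_enc: torch.Tensor):  # (batch, T, d_model) timestamp encodings
        q = self.query_proj(query).unsqueeze(1)                      # (batch, 1, d_model)
        k = self.key_proj(torch.cat([snapshots, time_enc], dim=-1))  # (batch, T, d_model)
        scores = (q * k).sum(-1) / self.scale                        # (batch, T)
        weights = F.softmax(scores, dim=-1)                          # attention over timestamps
        # Weighted pooling yields a single "focused" summary of the history.
        return (weights.unsqueeze(-1) * snapshots).sum(dim=1)        # (batch, d_model)

In SFP's terms, such a readout would let the model re-emphasize informative timestamps that a recurrent Sequential encoder tends to dilute over long histories.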
Pages: 18