Text-enhanced knowledge graph representation learning with local structure

Cited: 4
|
Authors
Li, Zhifei [1 ,2 ,3 ,4 ]
Jian, Yue [1 ]
Xue, Zengcan [5 ]
Zheng, Yumin [5 ]
Zhang, Miao [1 ,3 ,4 ]
Zhang, Yan [1 ,3 ,4 ]
Hou, Xiaoju [6 ]
Wang, Xiaoguang [2 ,7 ]
Affiliations
[1] Hubei Univ, Sch Comp Sci & Informat Engn, Wuhan 430062, Peoples R China
[2] Wuhan Univ, Intellectual Comp Lab Cultural Heritage, Wuhan 430072, Peoples R China
[3] Hubei Univ, Key Lab Intelligent Sensing Syst & Secur, Minist Educ, Wuhan 430062, Peoples R China
[4] Hubei Univ, Hubei Key Lab Big Data Intelligent Anal & Applicat, Wuhan 430062, Peoples R China
[5] Cent China Normal Univ, Fac Artificial Intelligence Educ, Wuhan 430079, Hubei, Peoples R China
[6] Guangdong Ind Polytech, Inst Vocat Educ, Guangzhou 510300, Peoples R China
[7] Wuhan Univ, Sch Informat Management, Wuhan 430072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Knowledge graph; Representation learning; Text encoder; Link prediction; EMBEDDINGS;
DOI
10.1016/j.ipm.2024.103797
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Knowledge graph representation learning entails transforming entities and relationships within a knowledge graph into vectors to enhance downstream tasks. The rise of pre-trained language models has recently promoted text-based approaches for knowledge graph representation learning. However, these methods often lack the structural information of knowledge graphs, prompting the challenge of integrating graph structure knowledge into text-based methodologies. To tackle this issue, we introduce a text-enhanced model with local structure (TEGS) that embeds local graph structure details from the knowledge graph into the text encoder. TEGS integrates k-hop neighbor entity information into the text encoder and employs a decoupled attention mechanism to blend relative position encoding and text semantics. This strategy augments learnable content through graph structure information and mitigates the impact of semantic ambiguity via the decoupled attention mechanism. Experimental findings demonstrate TEGS's effectiveness at fusing graph structure information, resulting in state-of-the-art performance across three datasets in link prediction tasks. In terms of Hit@1, when compared to previous text-based models, our model demonstrated improvements of 2.1% on WN18RR, 2.4% on FB15k-237, and 2.7% on the NELL-One dataset. Our code is made publicly available.
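The decoupled attention described in the abstract can be illustrated with a minimal sketch. This is an assumption-laden reconstruction in the style of DeBERTa's disentangled attention (content-to-content, content-to-position, and position-to-content score terms), not the paper's actual implementation; all function and weight names (`decoupled_attention_scores`, `Wq`, `Wkr`, etc.) are illustrative.

```python
import numpy as np

def decoupled_attention_scores(H, P, Wq, Wk, Wqr, Wkr):
    """Hypothetical sketch of a decoupled attention score.

    The score is split into three terms so that text semantics (content)
    and relative position encoding are mixed explicitly:
        content-to-content + content-to-position + position-to-content.
    H: (n, d) token content embeddings.
    P: (n, d) relative-position embeddings (simplified to one per token).
    Wq, Wk, Wqr, Wkr: (d, d) projection matrices (illustrative names).
    Returns the (n, n) row-stochastic attention matrix.
    """
    Qc, Kc = H @ Wq, H @ Wk      # content query/key projections
    Qr, Kr = P @ Wqr, P @ Wkr    # relative-position query/key projections
    d = Qc.shape[-1]
    scores = (Qc @ Kc.T          # content attends to content
              + Qc @ Kr.T        # content attends to position
              + Qr @ Kc.T)       # position attends to content
    scores = scores / np.sqrt(3 * d)  # scale for the three summed terms
    # numerically stable softmax over the key axis
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)
```

Under this decomposition, dropping the two position terms recovers plain scaled dot-product attention, which is one way to see how the mechanism injects structural/positional signal without entangling it with token semantics.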
Pages: 16
Related Papers
50 records
  • [21] Fusing structural information with knowledge enhanced text representation for knowledge graph completion
    Tang, Kang
    Li, Shasha
    Tang, Jintao
    Li, Dong
    Wang, Pancheng
    Wang, Ting
    DATA MINING AND KNOWLEDGE DISCOVERY, 2024, 38 (03) : 1316 - 1333
  • [22] BCRL: Long Text Friendly Knowledge Graph Representation Learning
    Wu, Gang
    Wu, Wenfang
    Li, Leilei
    Zhao, Guodong
    Han, Donghong
    Qiao, Baiyou
    SEMANTIC WEB - ISWC 2020, PT I, 2020, 12506 : 636 - 653
  • [23] Text-Enhanced Graph Attention Hashing for Cross-Modal Retrieval
    Zou, Qiang
    Cheng, Shuli
    Du, Anyu
    Chen, Jiayi
    ENTROPY, 2024, 26 (11)
  • [24] Semantic Communication Enhanced by Knowledge Graph Representation Learning
    Hello, Nour
    Di Lorenzo, Paolo
    Strinati, Emilio Calvanese
    2024 IEEE 25TH INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS, SPAWC 2024, 2024, : 876 - 880
  • [25] Knowledge-enhanced Spherical Representation Learning for Text Classification
    Ennajari, Hafsa
    Bouguila, Nizar
    Bentahar, Jamal
    PROCEEDINGS OF THE 2022 SIAM INTERNATIONAL CONFERENCE ON DATA MINING, SDM, 2022, : 639 - 647
  • [26] Temporal knowledge graph representation learning with local and global evolutions
    Zhang, Jiasheng
    Liang, Shuang
    Sheng, Yongpan
    Shao, Jie
    KNOWLEDGE-BASED SYSTEMS, 2022, 251
  • [27] JointGT: Graph-Text Joint Representation Learning for Text Generation from Knowledge Graphs
    Ke, Pei
    Ji, Haozhe
    Ran, Yu
    Cui, Xin
    Wang, Liwei
    Song, Linfeng
    Zhu, Xiaoyan
    Huang, Minlie
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 2526 - 2538
  • [28] Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning
    Shen, Tao
    Mao, Yi
    He, Pengcheng
    Long, Guodong
    Trischler, Adam
    Chen, Weizhu
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 8980 - 8994
  • [29] Local structure-aware graph contrastive representation learning
    Yang, Kai
    Liu, Yuan
    Zhao, Zijuan
    Ding, Peijin
    Zhao, Wenqian
    NEURAL NETWORKS, 2024, 172
  • [30] Type-Enhanced Temporal Knowledge Graph Representation Learning Model
    He P.
    Zhou G.
    Chen J.
    Zhang M.
    Ning Y.
    Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2023, 60 (04): : 916 - 929