Sentence-graph-level knowledge injection with multi-task learning

Times Cited: 0
Authors
Chen, Liyi [1 ]
Wang, Runze [2 ]
Shi, Chen [2 ]
Yuan, Yifei [3 ]
Liu, Jie [1 ]
Hu, Yuxiang [2 ]
Jiang, Feijun [2 ]
Affiliations
[1] Nankai Univ, Coll Artificial Intelligence, Tianjin, Peoples R China
[2] Alibaba Grp, Hangzhou, Peoples R China
[3] Chinese Univ Hong Kong, Hong Kong, Peoples R China
Source
WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS | 2025, Vol. 28, Issue 1
Funding
National Natural Science Foundation of China;
Keywords
Language representation learning; Knowledge graph; Knowledge injection; Multi-task learning;
DOI
10.1007/s11280-025-01329-z
CLC Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Language representation learning is a fundamental task in natural language understanding. It aims to represent natural language sentences and classify the entities and relations they mention, which usually requires injecting external entity and relation knowledge into the sentence representation. Existing methods typically inject factual knowledge into pre-trained language models by concatenating it sequentially after the sentence, paying less attention to the structured information in the knowledge graph and the interactions within it. In this paper, we learn sentence representations from both sentence- and graph-level knowledge at the fine-tuning stage with a multi-task learning framework (SenGraph). At the sentence level, we concatenate factual knowledge with the sentence in a sequential structure and train it with a sentence-level task. At the graph level, we organize all the knowledge and sentence information into a graph and introduce a relational graph attention network (GAT) to selectively inject useful knowledge into sentences. Meanwhile, we design two graph-based auxiliary tasks to align the heterogeneous embedding spaces of the natural language sentence and the knowledge graph. We evaluate our model on four knowledge-driven benchmark datasets. The experimental results demonstrate the effectiveness of the proposed method while using fewer computational resources.
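As a rough illustration of the graph-level injection step described above, the sketch below implements a minimal relational GAT layer in PyTorch: fact nodes pass relation-aware, attention-weighted messages to a sentence node. The layer design, dimensions, names, and toy graph are our own assumptions for illustration, not the authors' released code.

# Minimal sketch of graph-level knowledge injection with a relational GAT.
# All names (RelationalGATLayer, dims, the toy graph) are illustrative
# assumptions, not the SenGraph implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationalGATLayer(nn.Module):
    def __init__(self, dim, num_relations):
        super().__init__()
        self.w = nn.Linear(dim, dim, bias=False)          # node projection
        self.rel_emb = nn.Embedding(num_relations, dim)   # one vector per relation type
        self.attn = nn.Linear(3 * dim, 1)                 # scores [src; rel; dst]

    def forward(self, h, edge_index, edge_type):
        src, dst = edge_index                             # (E,), (E,)
        hs, hd = self.w(h[src]), self.w(h[dst])
        r = self.rel_emb(edge_type)
        e = F.leaky_relu(self.attn(torch.cat([hs, r, hd], dim=-1))).squeeze(-1)
        # Normalize attention over the incoming edges of each destination node.
        alpha = torch.zeros_like(e)
        for node in dst.unique():
            mask = dst == node
            alpha[mask] = F.softmax(e[mask], dim=0)
        out = torch.zeros_like(h)
        out.index_add_(0, dst, alpha.unsqueeze(-1) * hs)  # weighted message passing
        return F.elu(out)

# Toy usage: node 0 is the sentence node; nodes 1-2 are fact (entity) nodes.
h = torch.randn(3, 16)
edge_index = torch.tensor([[1, 2], [0, 0]])               # facts -> sentence
edge_type = torch.tensor([0, 1])                          # two relation types
layer = RelationalGATLayer(dim=16, num_relations=2)
sentence_repr = layer(h, edge_index, edge_type)[0]        # knowledge-injected sentence vector

In the full framework this knowledge-injected sentence vector would feed the main sentence-level objective alongside the two graph-based auxiliary tasks, with the losses combined under multi-task learning.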
Pages: 20