A prompt tuning method based on relation graphs for few-shot relation extraction

Times Cited: 0
Authors
Zhang, Zirui [1 ]
Yang, Yiyu [2 ]
Chen, Benhui [2 ,3 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Nanjing 210016, Jiangsu, Peoples R China
[2] Dali Univ, Dali 671000, Yunnan, Peoples R China
[3] Lijiang Normal Coll, Lijiang 674100, Yunnan, Peoples R China
Keywords
Relation extraction; Knowledge graph; Few-shot; Prompt tuning; Relation graph
DOI
10.1016/j.neunet.2025.107214
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Prompt tuning has recently proven effective for few-shot tasks. In the specific domain of few-shot relation extraction, however, task resources remain severely limited, making it critical to extract as much information as possible from them. Moreover, despite its successes, prompt tuning struggles to distinguish similar relations, which results in occasional prediction errors. To tackle these issues and fully exploit the available resources, this paper integrates global relation graphs and local relation subgraphs into the prompt-tuning framework. A global relation graph is first constructed to enhance the feature representations of samples across different relations based on label consistency; this graph is then partitioned into local relation subgraphs, one per relation type, to optimize the feature representations of samples within the same relation. This dual approach makes effective use of the limited supervised information and improves tuning efficiency. In addition, recognizing the substantial semantic knowledge embedded in relation labels, the study integrates that knowledge into the prompt-tuning process. Extensive experiments on four low-resource datasets validate the proposed method, demonstrating significant performance improvements; notably, the model also remains robust when discerning similar relations.
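The two-stage graph enhancement described in the abstract can be pictured with a short sketch. The PyTorch fragment below is a minimal illustration under assumed design choices, not the authors' implementation: the similarity-based global adjacency, the mixing weight `alpha`, and the name `enhance_features` are all assumptions introduced for exposition.

```python
import torch
import torch.nn.functional as F

def enhance_features(feats: torch.Tensor, labels: torch.Tensor, alpha: float = 0.5):
    """Hypothetical two-stage enhancement: global graph, then local subgraphs.

    feats:  (N, d) encoder outputs for N support samples
    labels: (N,)   relation label id of each sample
    """
    # Global relation graph: all samples are nodes; edge weights come from
    # scaled feature similarity, letting samples of different relations
    # exchange information (one plausible reading of the abstract).
    sim = feats @ feats.t() / feats.size(1) ** 0.5
    adj = F.softmax(sim, dim=1)        # row-normalized adjacency
    global_feats = adj @ feats         # one feature-propagation step

    # Local relation subgraphs: partition by relation type and pull each
    # sample toward its subgraph mean, so same-relation samples cohere.
    out = global_feats.clone()
    for r in labels.unique():
        mask = labels == r
        proto = global_feats[mask].mean(dim=0)
        out[mask] = alpha * global_feats[mask] + (1.0 - alpha) * proto
    return out

# Toy usage: 6 support samples over 2 relations with 8-dim features.
feats = torch.randn(6, 8)
labels = torch.tensor([0, 0, 0, 1, 1, 1])
print(enhance_features(feats, labels).shape)  # torch.Size([6, 8])
```

The enhanced features would then feed the prompt-tuning step; how the label semantics are verbalized into the prompt is not specified at this level of detail.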
Pages: 10
Related Papers (50 in total)
  • [1] A lightweight approach based on prompt for few-shot relation extraction
    Zhang, Ying
    Huang, Wencheng
    Dang, Depeng
    COMPUTER SPEECH AND LANGUAGE, 2024, 84
  • [2] Knowledge-enhanced meta-prompt for few-shot relation extraction
    Cui, Jinman
    Xu, Fu
    Wang, Xinyang
    Li, Yakun
    Qu, Xiaolong
    Yao, Lei
    Li, Dongmei
    COMPUTER SPEECH AND LANGUAGE, 2025, 91
  • [3] Virtual prompt pre-training for prototype-based few-shot relation extraction
    He, Kai
    Huang, Yucheng
    Mao, Rui
    Gong, Tieliang
    Li, Chen
    Cambria, Erik
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 213
  • [4] Few-shot medical relation extraction via prompt tuning enhanced pre-trained language model
    He, Guoxiu
    Huang, Chen
    NEUROCOMPUTING, 2025, 633
  • [5] Contextual Information Augmented Few-Shot Relation Extraction
    Wang, Tian
    Wang, Zhiguang
    Wang, Rongliang
    Li, Dawei
    Lu, Qiang
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT I, KSEM 2023, 2023, 14117 : 138 - 149
  • [6] Few-Shot Relation Extraction Towards Special Interests
    Fan, Siqi
    Zhang, Binbin
    Zhou, Silin
    Wang, Menghan
    Li, Ke
    BIG DATA RESEARCH, 2021, 26
  • [7] Bidirectional matching and aggregation network for few-shot relation extraction
    Wei, Zhongcheng
    Guo, Wenjie
    Zhang, Yunping
    Zhang, Jieying
    Zhao, Jijun
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [8] Few-Shot Relation Extraction With Automatically Generated Prompts
    Zhao, Xiaoyan
    Yang, Min
    Qu, Qiang
    Xu, Ruifeng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (03) : 4971 - 4983
  • [9] Enhanced prototypical network for few-shot relation extraction
    Wen, Wen
    Liu, Yongbin
    Ouyang, Chunping
    Lin, Qiang
    Chung, Tonglee
    INFORMATION PROCESSING & MANAGEMENT, 2021, 58 (04)