PTCAS: Prompt tuning with continuous answer search for relation extraction

Cited by: 4
Authors
Chen, Yang [1 ]
Shi, Bowen [2 ]
Xu, Ke [1 ]
Affiliations
[1] Beihang Univ, State Key Lab Software Dev Environm, Beijing 100191, Peoples R China
[2] Commun Univ China, Sch Journalism, Beijing 100024, Peoples R China
Keywords
Prompt tuning; Relation extraction; Few-shot learning; Pretrained language model
DOI
10.1016/j.ins.2023.120060
Chinese Library Classification (CLC)
TP [Automation technology; Computer technology]
Discipline classification code
0812
Abstract
Tremendous progress has been made in fine-tuning pretrained language models (PLMs), which achieve outstanding results on almost all natural language processing (NLP) tasks. The rich knowledge encoded in PLMs can be further elicited by augmenting fine-tuning with additional prompts, namely, prompt tuning. Generally, prompt engineering comprises template engineering, the process of searching for an appropriate template for a specific task, and answer engineering, whose objective is to find an answer space and map it to the original task label set. Existing prompt-based methods are primarily designed manually and search for appropriate verbalizations in a discrete answer space, which is insufficient and often results in suboptimal performance on complex NLP tasks such as relation extraction (RE). We therefore propose a novel prompt-tuning method with a continuous answer search for RE, which enables the model to find optimal answer word representations in a continuous space through gradient descent and thus fully exploit relation semantics. To further exploit entity-type information and integrate structured knowledge into our approach, we design and add a TransH-based structured knowledge constraint to the optimization procedure. We conducted comprehensive experiments on four RE benchmarks to evaluate the effectiveness of the proposed approach. The experimental results show that, compared to existing baselines, our approach achieves competitive or superior performance without manual answer engineering in both fully supervised and low-resource scenarios.
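To make the abstract's two core ideas concrete, the sketch below shows (i) a continuous answer head that replaces discrete verbalizer tokens with one trainable vector per relation label, optimized by gradient descent, and (ii) a TransH-style hyperplane-projection loss used as a structured knowledge constraint. This is a minimal illustration under assumed PyTorch conventions, not the authors' implementation; all names (ContinuousAnswerHead, transh_constraint, lam) and the initialization scale are hypothetical.

```python
# Illustrative sketch only: names and hyperparameters below are assumptions,
# not taken from the PTCAS paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ContinuousAnswerHead(nn.Module):
    """One trainable answer vector per relation label.

    Rather than mapping [MASK] predictions onto fixed verbalizer tokens,
    each relation owns a vector in the PLM hidden space that gradient
    descent is free to move to an optimal point.
    """

    def __init__(self, num_relations: int, hidden_size: int):
        super().__init__()
        self.answer_emb = nn.Parameter(0.02 * torch.randn(num_relations, hidden_size))

    def forward(self, mask_hidden: torch.Tensor) -> torch.Tensor:
        # mask_hidden: (batch, hidden) PLM state at the [MASK] position.
        # Score each relation by dot product with its answer vector.
        return mask_hidden @ self.answer_emb.t()


def transh_constraint(h: torch.Tensor, d_r: torch.Tensor,
                      w_r: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
    """TransH-style auxiliary loss (Wang et al., 2014): project head/tail
    entity embeddings onto the relation hyperplane with normal w_r, then
    require h_perp + d_r to be close to t_perp."""
    w = F.normalize(w_r, dim=-1)                        # unit hyperplane normal
    h_perp = h - (h * w).sum(-1, keepdim=True) * w      # project head onto plane
    t_perp = t - (t * w).sum(-1, keepdim=True) * w      # project tail onto plane
    return (h_perp + d_r - t_perp).norm(p=2, dim=-1).mean()


# Joint objective (lam is a hypothetical weighting coefficient):
#   loss = F.cross_entropy(head(mask_hidden), labels) \
#        + lam * transh_constraint(h_emb, d_r, w_r, t_emb)
```

In this reading, the classification loss drives the answer vectors toward representations that discriminate relations, while the TransH term pulls them toward a structure consistent with entity-type knowledge; how the entity and relation embeddings are tied to the PLM is a design choice the paper itself specifies.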
Pages: 10