PTCAS: Prompt tuning with continuous answer search for relation extraction

Cited: 4
Authors
Chen, Yang [1 ]
Shi, Bowen [2 ]
Xu, Ke [1 ]
Affiliations
[1] Beihang Univ, State Key Lab Software Dev Environm, Beijing 100191, Peoples R China
[2] Commun Univ China, Sch Journalism, Beijing 100024, Peoples R China
Keywords
Prompt tuning; Relation extraction; Few-shot learning; Pretrained language model;
DOI
10.1016/j.ins.2023.120060
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Tremendous progress has been made in the development of fine-tuned pretrained language models (PLMs), which achieve outstanding results on almost all natural language processing (NLP) tasks. The rich knowledge stored within PLMs can be further elicited by fine-tuning with additional prompts, namely, prompt tuning. Generally, prompt engineering involves prompt template engineering, the process of searching for an appropriate template for a specific task, and answer engineering, whose objective is to seek an answer space and map it to the original task label set. Existing prompt-based methods are primarily designed manually and search for appropriate verbalizations in a discrete answer space, which is insufficient and often results in suboptimal performance on complex NLP tasks such as relation extraction (RE). Therefore, we propose a novel prompt-tuning method with a continuous answer search for RE, which enables the model to find optimal answer word representations in a continuous space through gradient descent and thus fully exploit the relation semantics. To further exploit entity-type information and integrate structured knowledge into our approach, we designed and added an additional TransH-based structured knowledge constraint to the optimization procedure. We conducted comprehensive experiments on four RE benchmarks to evaluate the effectiveness of the proposed approach. The experimental results show that our approach achieves competitive or superior performance without manual answer engineering compared to existing baselines under both fully supervised and low-resource scenarios.
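The core idea of continuous answer search can be illustrated with a minimal, hypothetical sketch (not the paper's implementation): instead of mapping each relation to a discrete answer word, each relation gets a learnable answer embedding that is optimized directly by gradient descent against the [MASK]-position hidden states from a PLM. Here the PLM hidden states are simulated as noisy clusters, and plain NumPy gradient descent on a softmax cross-entropy loss stands in for the full training loop; all names and dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_rel, n = 16, 4, 200  # hidden size, number of relations, number of examples

# Simulated [MASK] hidden states: one cluster per relation label
# (a stand-in for the PLM's output at the masked answer position).
centers = rng.normal(size=(n_rel, d))
y = rng.integers(0, n_rel, size=n)
H = centers[y] + 0.1 * rng.normal(size=(n, d))

# Continuous answer embeddings, one per relation, initialized near zero
# and learned end-to-end rather than chosen from the vocabulary.
A = 0.01 * rng.normal(size=(n_rel, d))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

onehot = np.eye(n_rel)[y]
lr = 0.5
for _ in range(300):
    probs = softmax(H @ A.T)            # p(relation | [MASK] state)
    grad = (probs - onehot).T @ H / n   # d(cross-entropy) / dA
    A -= lr * grad                      # answer search in continuous space

pred = (H @ A.T).argmax(axis=1)
accuracy = (pred == y).mean()
```

After training, `A[k]` plays the role of the "answer word" for relation `k`: classification scores each [MASK] representation against every learned answer embedding, with no manual verbalizer design. The paper's TransH-based structured knowledge constraint would enter as an additional term in the loss, which this toy sketch omits.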
Pages: 10