PTCAS: Prompt tuning with continuous answer search for relation extraction

Cited by: 4
Authors
Chen, Yang [1 ]
Shi, Bowen [2 ]
Xu, Ke [1 ]
Affiliations
[1] Beihang Univ, State Key Lab Software Dev Environm, Beijing 100191, Peoples R China
[2] Commun Univ China, Sch Journalism, Beijing 100024, Peoples R China
Keywords
Prompt tuning; Relation extraction; Few-shot learning; Pretrained language model;
DOI
10.1016/j.ins.2023.120060
Chinese Library Classification
TP [automation technology; computer technology]
Discipline Classification Code
0812
Abstract
Tremendous progress has been made in fine-tuning pretrained language models (PLMs), which achieve outstanding results on almost all natural language processing (NLP) tasks. The rich knowledge stored in PLMs can be further elicited by augmenting fine-tuning with additional prompts, a technique known as prompt tuning. Generally, prompt engineering involves prompt template engineering, the process of searching for an appropriate template for a specific task, and answer engineering, whose objective is to find an answer space and map it to the original task label set. Existing prompt-based methods are primarily designed manually and search for appropriate verbalizations in a discrete answer space, which is insufficient and often results in suboptimal performance on complex NLP tasks such as relation extraction (RE). Therefore, we propose a novel prompt-tuning method with continuous answer search for RE, which enables the model to find optimal answer-word representations in a continuous space through gradient descent and thus fully exploit the relation semantics. To further exploit entity-type information and integrate structured knowledge into our approach, we designed and added a TransH-based structured knowledge constraint to the optimization procedure. We conducted comprehensive experiments on four RE benchmarks to evaluate the effectiveness of the proposed approach. The experimental results show that, compared with existing baselines, our approach achieves competitive or superior performance without manual answer engineering under both fully supervised and low-resource scenarios.
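To make the two core ideas of the abstract concrete, the following is a minimal PyTorch-style sketch of continuous answer search with a TransH-style structured-knowledge penalty. This is not the authors' implementation: every name, dimension, and the exact form of the penalty below are illustrative assumptions based only on the abstract.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ContinuousAnswerSearch(nn.Module):
        """Sketch: instead of mapping each relation label to a discrete
        vocabulary word, keep one trainable answer vector per relation
        and optimize it directly by gradient descent."""

        def __init__(self, num_relations: int, hidden_size: int):
            super().__init__()
            # One continuous "answer word" embedding per relation label.
            self.answer_embeds = nn.Parameter(
                0.02 * torch.randn(num_relations, hidden_size))

        def forward(self, mask_hidden: torch.Tensor) -> torch.Tensor:
            # mask_hidden: (batch, hidden_size), the PLM hidden state at
            # the [MASK] position of the prompt template.
            # Returns (batch, num_relations) classification logits.
            return mask_hidden @ self.answer_embeds.t()

    def transh_constraint(head: torch.Tensor, tail: torch.Tensor,
                          rel_trans: torch.Tensor,
                          rel_norm: torch.Tensor) -> torch.Tensor:
        # TransH-style penalty: project head and tail entity embeddings
        # onto the relation-specific hyperplane (unit normal rel_norm),
        # then require head_proj + rel_trans to be close to tail_proj.
        w = F.normalize(rel_norm, dim=-1)
        head_p = head - (head * w).sum(dim=-1, keepdim=True) * w
        tail_p = tail - (tail * w).sum(dim=-1, keepdim=True) * w
        return ((head_p + rel_trans - tail_p) ** 2).sum(dim=-1).mean()

A training objective in this spirit would combine the classification loss with the penalty, e.g. loss = F.cross_entropy(logits, labels) + lambda_kg * transh_constraint(...), where lambda_kg is a hypothetical weight on the structured knowledge constraint; the paper's actual formulation may differ.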
Pages: 10
Related Papers
43 records in total
  • [1] Prompt Tuning in Biomedical Relation Extraction
    He, Jianping
    Li, Fang
    Li, Jianfu
    Hu, Xinyue
    Nian, Yi
    Xiang, Yang
    Wang, Jingqi
    Wei, Qiang
    Li, Yiming
    Xu, Hua
    Tao, Cui
    JOURNAL OF HEALTHCARE INFORMATICS RESEARCH, 2024, 8 (02) : 206 - 224
  • [2] Judicial Text Relation Extraction Based on Prompt Tuning
    Chen, Xue
    Li, Yi
    Fan, Shuhuan
    Hou, Mengshu
    2024 2ND ASIA CONFERENCE ON COMPUTER VISION, IMAGE PROCESSING AND PATTERN RECOGNITION, CVIPPR 2024, 2024,
  • [3] GAP: A novel Generative context-Aware Prompt-tuning method for relation extraction
    Chen, Zhenbin
    Li, Zhixin
    Zeng, Yufei
    Zhang, Canlong
    Ma, Huifang
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 248
  • [4] Context-aware generative prompt tuning for relation extraction
    Liu, Xiaoyong
    Wen, Handong
    Xu, Chunlin
    Du, Zhiguo
    Li, Huihui
    Hu, Miao
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (12) : 5495 - 5508
  • [5] Relation Extraction as Open-book Examination: Retrieval-enhanced Prompt Tuning
    Chen, Xiang
    Li, Lei
    Zhang, Ningyu
    Tan, Chuanqi
    Huang, Fei
    Si, Luo
    Chen, Huajun
    PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022, : 2443 - 2448
  • [6] APRE: Annotation-Aware Prompt-Tuning for Relation Extraction
    Wei, Chao
    Chen, Yanping
    Wang, Kai
    Qin, Yongbin
    Huang, Ruizhang
    Zheng, Qinghua
    NEURAL PROCESSING LETTERS, 2024, 56 (02)
  • [7] A prompt tuning method based on relation graphs for few-shot relation extraction
    Zhang, Zirui
    Yang, Yiyu
    Chen, Benhui
    NEURAL NETWORKS, 2025, 185
  • [8] BioKnowPrompt: Incorporating imprecise knowledge into prompt-tuning verbalizer with biomedical text for relation extraction
    Li, Qing
    Wang, Yichen
    You, Tao
    Lu, Yantao
    INFORMATION SCIENCES, 2022, 617 : 346 - 358