PTCAS: Prompt tuning with continuous answer search for relation extraction

Cited by: 4
Authors
Chen, Yang [1 ]
Shi, Bowen [2 ]
Xu, Ke [1 ]
Affiliations
[1] Beihang Univ, State Key Lab Software Dev Environm, Beijing 100191, Peoples R China
[2] Commun Univ China, Sch Journalism, Beijing 100024, Peoples R China
Keywords
Prompt tuning; Relation extraction; Few-shot learning; Pretrained language model;
DOI
10.1016/j.ins.2023.120060
CLC classification number
TP [Automation technology, computer technology];
Discipline classification code
0812
Abstract
Tremendous progress has been made in fine-tuning pretrained language models (PLMs), which achieve outstanding results on almost all natural language processing (NLP) tasks. The rich knowledge distributed within PLMs can be further elicited by fine-tuning with additional prompts, namely, prompt tuning. Generally, prompt engineering involves prompt template engineering, the process of searching for an appropriate template for a specific task, and answer engineering, whose objective is to seek an answer space and map it to the original task label set. Existing prompt-based methods are primarily designed manually and search for appropriate verbalizations in a discrete answer space, which is insufficient and often results in suboptimal performance on complex NLP tasks such as relation extraction (RE). Therefore, we propose a novel prompt-tuning method with a continuous answer search for RE, which enables the model to find optimal answer word representations in a continuous space through gradient descent and thus fully exploit the relation semantics. To further exploit entity-type information and integrate structured knowledge into our approach, we design and add an additional TransH-based structured knowledge constraint to the optimization procedure. We conducted comprehensive experiments on four RE benchmarks to evaluate the effectiveness of the proposed approach. The experimental results show that our approach achieves competitive or superior performance without manual answer engineering compared to existing baselines under both fully supervised and low-resource scenarios.
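The core idea of continuous answer search, as summarized in the abstract, can be illustrated with a minimal sketch: instead of mapping relation labels to discrete answer words, each label is assigned a learnable answer embedding that is optimized by gradient descent against the [MASK]-position hidden state. The sketch below is a hypothetical toy version with random stand-in vectors, not the authors' implementation; names such as `H`, `A`, and `loss_and_grad` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, num_relations, batch = 16, 4, 8

# Stand-ins for the PLM's [MASK]-position hidden states and gold relation
# labels; in the actual method these come from the prompted encoder.
H = rng.normal(size=(batch, dim))
labels = rng.integers(0, num_relations, size=batch)

# Continuous answer embeddings: one learnable vector per relation label,
# optimized directly rather than chosen from the discrete vocabulary.
A = rng.normal(scale=0.1, size=(num_relations, dim))

def loss_and_grad(A):
    """Softmax cross-entropy of [MASK] states against answer embeddings."""
    logits = H @ A.T                                  # (batch, num_relations)
    logits = logits - logits.max(axis=1, keepdims=True)
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    loss = -np.log(p[np.arange(batch), labels]).mean()
    p[np.arange(batch), labels] -= 1.0                # dL/dlogits = (p - onehot) / batch
    grad = (p / batch).T @ H                          # dL/dA
    return loss, grad

first_loss, _ = loss_and_grad(A)
for _ in range(200):
    loss, g = loss_and_grad(A)
    A -= 0.5 * g                                      # plain gradient descent
```

After training, each row of `A` plays the role of an "optimal answer word representation" for its relation label; the paper additionally regularizes such representations with a TransH-based structured knowledge constraint, which is not modeled in this toy sketch.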
Pages: 10
Related papers (43 in total)
  • [31] Integrating graph convolutional networks to enhance prompt learning for biomedical relation extraction
    Guo, Bocheng
    Meng, Jiana
    Zhao, Di
    Jia, Xiangxing
    Chu, Yonghe
    Lin, Hongfei
    JOURNAL OF BIOMEDICAL INFORMATICS, 2024, 157
  • [32] Few-Shot Relation Extraction Through Prompt With Relation Information and Multi-Level Contrastive Learning
    Dong, Ye
    Yang, Rong
    Liu, Junbao
    Qin, Xizhong
    IEEE ACCESS, 2024, 12 : 123352 - 123361
  • [33] Clinical concept and relation extraction using prompt-based machine reading comprehension
    Peng, Cheng
    Yang, Xi
    Yu, Zehao
    Bian, Jiang
    Hogan, William R.
    Wu, Yonghui
    JOURNAL OF THE AMERICAN MEDICAL INFORMATICS ASSOCIATION, 2023, 30 (09) : 1486 - 1493
  • [34] PURE: a Prompt-based framework with dynamic Update mechanism for educational Relation Extraction
    Cui, Xiaohui
    Yang, Yu
    Li, Dongmei
    Cui, Jinman
    Qu, Xiaolong
    Song, Chao
    Liu, Haoran
    Ke, Siyuan
    COMPLEX & INTELLIGENT SYSTEMS, 2025, 11 (01)
  • [35] A PROMPT-BASED METHOD WITH MULTI-VIEW OPTIMIZATION FOR OPEN RELATION EXTRACTION
    Zhang, Ying
    Dang, Depeng
    Wang, Ning
    Gao, Hu
    2024 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2024), 2024, : 10911 - 10915
  • [36] REA: Refine-Estimate-Answer Prompting for Zero-Shot Relation Extraction
    Layegh, Amirhossein
    Payberah, Amir H.
    Matskin, Mihhail
    NATURAL LANGUAGE PROCESSING AND INFORMATION SYSTEMS, PT I, NLDB 2024, 2024, 14762 : 301 - 316
  • [37] Leveraging Prompt and Top-K Predictions with ChatGPT Data Augmentation for Improved Relation Extraction
    Feng, Ping
    Wu, Hang
    Yang, Ziqian
    Wang, Yunyi
    Ouyang, Dantong
    APPLIED SCIENCES-BASEL, 2023, 13 (23):
  • [38] Think Rationally about What You See: Continuous Rationale Extraction for Relation Extraction
    Hu, Xuming
    Hong, Zhaochen
    Zhang, Chenwei
    King, Irwin
    Yu, Philip S.
    PROCEEDINGS OF THE 46TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2023, 2023, : 2436 - 2440
  • [39] Is Prompt the Future? A Survey of Evolution of Relation Extraction Approach Using Deep Learning and Big Data
    Zhu, Zhen
    Wang, Liting
    Gu, Dongmei
    Wu, Hong
    Janfada, Behrooz
    Minaei-Bidgoli, Behrouz
    INTERNATIONAL JOURNAL OF INFORMATION TECHNOLOGIES AND SYSTEMS APPROACH, 2023, 16 (01) : 1172 - 1189
  • [40] Investigation of BERT Model on Biomedical Relation Extraction Based on Revised Fine-tuning Mechanism
    Su, Peng
    Vijay-Shanker, K.
    2020 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE, 2020, : 2522 - 2529