KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction

Cited by: 186
Authors
Chen, Xiang [1 ]
Zhang, Ningyu [1 ]
Xie, Xin [1 ]
Deng, Shumin [1 ]
Yao, Yunzhi [1 ]
Tan, Chuanqi [2 ]
Huang, Fei [2 ]
Si, Luo [2 ]
Chen, Huajun [1 ]
Affiliations
[1] Zhejiang University, AZFT Joint Lab for Knowledge Engine, Hangzhou Innovation Center, Hangzhou, Zhejiang, People's Republic of China
[2] Alibaba Group, Hangzhou, Zhejiang, People's Republic of China
Source
Proceedings of the ACM Web Conference 2022 (WWW '22) | 2022
Funding
National Key Research and Development Program of China;
Keywords
Relation Extraction; Prompt-tuning; Knowledge-aware;
DOI
10.1145/3485447.3511998
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Recently, prompt-tuning has achieved promising results for specific few-shot classification tasks. The core idea of prompt-tuning is to insert text pieces (i.e., templates) into the input and transform a classification task into a masked language modeling problem. However, for relation extraction, determining an appropriate prompt template requires domain expertise, and it is cumbersome and time-consuming to obtain a suitable label word. Furthermore, there exists abundant semantic and prior knowledge among the relation labels that cannot be ignored. To this end, we focus on incorporating knowledge among relation labels into prompt-tuning for relation extraction and propose a Knowledge-aware Prompt-tuning approach with synergistic optimization (KnowPrompt). Specifically, we inject latent knowledge contained in relation labels into prompt construction with learnable virtual type words and answer words. Then, we synergistically optimize their representation with structured constraints. Extensive experimental results on five datasets with standard and low-resource settings demonstrate the effectiveness of our approach. Our code and datasets are available on GitHub for reproducibility.
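To make the core idea of prompt-tuning concrete, the following is a minimal illustrative sketch (not the authors' released code) of prompt-based relation classification with a masked language model: a template with a mask slot is appended after the entity mentions, and candidate answer words for each relation are scored at the mask position. The backbone model, template, and relation-to-answer-word mapping below are assumptions for illustration only; KnowPrompt replaces such hand-picked answer words and entity type markers with learnable virtual words optimized jointly under structured constraints.

```python
# Illustrative sketch of prompt-based relation classification with a masked LM.
# The backbone ("bert-base-uncased"), the template, and the relation-to-answer-word
# mapping are assumptions for illustration, not the KnowPrompt implementation.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

sentence = "Bill Gates founded Microsoft in 1975."
subj, obj = "Bill Gates", "Microsoft"

# Template: "<sentence> <subject> [MASK] <object>." -- the classification task
# is recast as predicting a word at the [MASK] position.
prompt = f"{sentence} {subj} {tokenizer.mask_token} {obj}."
inputs = tokenizer(prompt, return_tensors="pt")

# Hypothetical mapping from relation labels to single answer words.
answer_words = {"per:founder_of": "founded", "per:employee_of": "joined"}

with torch.no_grad():
    logits = model(**inputs).logits  # shape: [1, seq_len, vocab_size]

# Locate the [MASK] position and score each relation by its answer word's logit.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
scores = {
    rel: logits[0, mask_pos, tokenizer.convert_tokens_to_ids(word)].item()
    for rel, word in answer_words.items()
}
print(max(scores, key=scores.get))  # relation whose answer word best fills the mask
```

In KnowPrompt, the fixed answer words above would instead be learnable virtual answer words (and virtual type words around the entities) whose representations are initialized from knowledge in the relation labels and optimized synergistically with structured constraints.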
Pages: 2778-2788
Number of pages: 11