A Pre-trained Language Model for Medical Question Answering Based on Domain Adaption

Cited by: 2
Authors
Liu, Lang [1 ]
Ren, Junxiang [1 ]
Wu, Yuejiao [1 ]
Song, Ruilin [1 ]
Cheng, Zhen [1 ]
Wang, Sibo [1 ]
Affiliations
[1] China Pacific Insurance Group Co., Ltd., Shanghai, People's Republic of China
Source
NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2022, PT II, 2022, Vol. 13552
Keywords
Question answering; DAP; TAP; Pretraining
DOI
10.1007/978-3-031-17189-5_18
CLC Number
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
With the successful application of question answering (QA) in human-computer interaction scenarios such as chatbots and search engines, medical QA systems have gradually attracted widespread attention: they not only help professionals make decisions efficiently, but also offer advice to non-professionals seeking useful information. However, because medical domain knowledge is highly specialized, existing medical QA systems still struggle to understand it, and as a result cannot generate fluent and accurate answers. The goal of this paper is to continue training the language model on top of its general pre-training: a better language model yields a better medical QA model. By combining domain-adaptive pretraining (DAP) and task-adaptive pretraining (TAP), the model acquires knowledge of both the medical domain and the target task, which helps the QA model generate smooth and accurate answers and achieve good results.
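For concreteness, the DAP-then-TAP recipe the abstract describes amounts to two rounds of continued masked-language-model pretraining before task fine-tuning. The following is a minimal sketch using the HuggingFace transformers Trainer; the bert-base-chinese checkpoint, the toy corpora, and all hyperparameters are illustrative assumptions, not the authors' actual setup.

```python
# Minimal sketch of DAP -> TAP continued pretraining. Assumptions: an MLM
# objective, bert-base-chinese as the base model, toy corpora; this is not
# the paper's reported configuration.
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForMaskedLM.from_pretrained("bert-base-chinese")

def mlm_stage(texts, output_dir):
    """One continued-pretraining stage: mask 15% of tokens, train the MLM."""
    enc = tokenizer(texts, truncation=True, max_length=128)
    dataset = [{"input_ids": ids} for ids in enc["input_ids"]]
    Trainer(
        model=model,
        args=TrainingArguments(output_dir=output_dir, num_train_epochs=1,
                               per_device_train_batch_size=8),
        # The collator pads each batch and applies random masking on the fly.
        data_collator=DataCollatorForLanguageModeling(
            tokenizer=tokenizer, mlm=True, mlm_probability=0.15),
        train_dataset=dataset,
    ).train()

# Stage 1 (DAP): unlabeled in-domain medical text.
# (e.g. "Diabetic patients should monitor blood glucose regularly.")
mlm_stage(["糖尿病患者应定期监测血糖。", "阿司匹林常用于解热镇痛。"], "dap_ckpt")
# Stage 2 (TAP): raw text drawn from the downstream QA task itself.
mlm_stage(["问:高血压患者饮食需要注意什么? 答:低盐低脂,规律作息。"], "tap_ckpt")
# The resulting checkpoint is then fine-tuned on labeled medical QA pairs.
```

The ordering is the point of the recipe: DAP first adapts the model to the broad medical domain, and TAP then narrows it to the distribution of the QA task itself, which is the combination the abstract credits for fluent and accurate answers.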
Pages: 216-227 (12 pages)
Related Papers
50 items in total
  • [1] Pre-trained Language Model for Biomedical Question Answering
    Yoon, Wonjin; Lee, Jinhyuk; Kim, Donghyeon; Jeong, Minbyul; Kang, Jaewoo
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2019, PT II, 2020, 1168: 727-740
  • [2] Question-answering Forestry Pre-trained Language Model: ForestBERT
    Tan, Jingwei; Zhang, Huaiqing; Liu, Yang; Yang, Jie; Zheng, Dongping
    Linye Kexue/Scientia Silvae Sinicae, 2024, 60 (09): 99-110
  • [3] Question Answering based Clinical Text Structuring Using Pre-trained Language Model
    Qiu, Jiahui; Zhou, Yangming; Ma, Zhiyuan; Ruan, Tong; Liu, Jinlin; Sun, Jing
    2019 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE (BIBM), 2019: 1596-1600
  • [4] Improving Visual Question Answering with Pre-trained Language Modeling
    Wu, Yue; Gao, Huiyi; Chen, Lei
    FIFTH INTERNATIONAL WORKSHOP ON PATTERN RECOGNITION, 2020, 11526
  • [5] K-AID: Enhancing Pre-trained Language Models with Domain Knowledge for Question Answering
    Sun, Fu; Li, Feng-Lin; Wang, Ruize; Chen, Qianglong; Cheng, Xingyi; Zhang, Ji
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021: 4125-4134
  • [6] Augmenting Pre-trained Language Models with QA-Memory for Open-Domain Question Answering
    Chen, Wenhu; Verga, Pat; de Jong, Michiel; Wieting, John; Cohen, William W.
    17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023: 1597-1610
  • [7] Multi-Hop Knowledge Base Question Answering with Pre-Trained Language Model Feature Enhancement
    Wei, Qianqiang; Zhao, Shuliang; Lu, Danqi; Jia, Xiaowen; Yang, Shilong
    Computer Engineering and Applications, 2024, 60 (22): 184-196
  • [8] Schema matching based on energy domain pre-trained language model
    Pan, Z.; Yang, M.; Monti, A.
    Energy Informatics, 2023, 6 (Suppl 1)
  • [9] InA: Inhibition Adaption on pre-trained language models
    Kang, Cheng; Prokop, Jindrich; Tong, Lei; Zhou, Huiyu; Hu, Yong; Novak, Daniel
    NEURAL NETWORKS, 2024, 178
  • [10] An empirical study of pre-trained language models in simple knowledge graph question answering
    Hu, Nan; Wu, Yike; Qi, Guilin; Min, Dehai; Chen, Jiaoyan; Pan, Jeff Z.; Ali, Zafar
    WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2023, 26 (05): 2855-2886