Question Answering based Clinical Text Structuring Using Pre-trained Language Model

Times Cited: 0
Authors
Qiu, Jiahui [1 ]
Zhou, Yangming [1 ]
Ma, Zhiyuan [1 ]
Ruan, Tong [1 ]
Liu, Jinlin [1 ]
Sun, Jing [2 ]
Affiliations
[1] East China Univ Sci & Technol, Sch Informat Sci & Engn, Shanghai 200237, Peoples R China
[2] Shanghai Jiao Tong Univ, Ruijin Hosp, Sch Med, Shanghai 200025, Peoples R China
Source
2019 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE (BIBM) | 2019
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
Question answering; Clinical text structuring; Pre-trained language model; Electronic health records;
DOI
10.1109/bibm47256.2019.8983142
Chinese Library Classification (CLC)
Q5 [Biochemistry];
Subject Classification Codes
071010; 081704;
Abstract
Clinical text structuring (CTS) is a critical and fundamental task for clinical research. Traditional methods, such as task-specific end-to-end models and pipeline models, usually suffer from a lack of datasets and from error propagation. In this paper, we present a question answering based clinical text structuring (QA-CTS) task to unify different specific CTS tasks and make datasets shareable. A novel model that introduces domain-specific features (e.g., clinical named entity information) into a pre-trained language model is also proposed for the QA-CTS task. Experimental results on Chinese pathology reports collected from Ruijin Hospital demonstrate that our presented QA-CTS task is very effective in improving performance on specific tasks. Our proposed model also competes favorably with strong baseline models on specific tasks.
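As a rough illustration of the idea described in the abstract (not the authors' published architecture), the sketch below shows one plausible way to inject clinical named entity information into a pre-trained language model for QA-style span extraction. The class name QACTSSpanExtractor, the bert-base-chinese checkpoint, the NER tag embedding size, and the concatenation-based fusion are all assumptions made for illustration.

    # Minimal sketch, assuming PyTorch and HuggingFace Transformers are available.
    import torch
    import torch.nn as nn
    from transformers import BertModel

    class QACTSSpanExtractor(nn.Module):
        """Fuse clinical NER tag embeddings with pre-trained BERT token
        representations and predict an answer span (start/end positions)."""

        def __init__(self, bert_name="bert-base-chinese", num_ner_tags=10, ner_dim=32):
            super().__init__()
            self.bert = BertModel.from_pretrained(bert_name)
            hidden = self.bert.config.hidden_size
            # Hypothetical choice: embed per-token clinical NER tags (e.g. BIO labels)
            # and concatenate them with the contextual token representations.
            self.ner_embed = nn.Embedding(num_ner_tags, ner_dim)
            self.span_head = nn.Linear(hidden + ner_dim, 2)

        def forward(self, input_ids, attention_mask, ner_tag_ids):
            # input_ids encodes "[CLS] question [SEP] clinical text [SEP]".
            out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
            token_repr = out.last_hidden_state                       # (B, T, H)
            fused = torch.cat([token_repr, self.ner_embed(ner_tag_ids)], dim=-1)
            logits = self.span_head(fused)                           # (B, T, 2)
            start_logits, end_logits = logits.split(1, dim=-1)
            return start_logits.squeeze(-1), end_logits.squeeze(-1)  # (B, T) each

At inference time, the extracted span would serve as the structured answer to a task-specific question (e.g., "What is the tumor size?") asked against a pathology report.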
Pages: 1596 - 1600
Number of Pages: 5
Related Papers
50 records in total
  • [21] Fine-Grained Sentiment-Controlled Text Generation Approach Based on Pre-Trained Language Model
    Zhu, Linan
    Xu, Yifei
    Zhu, Zhechao
    Bao, Yinwei
    Kong, Xiangjie
    APPLIED SCIENCES-BASEL, 2023, 13 (01):
  • [22] BioHanBERT: A Hanzi-aware Pre-trained Language Model for Chinese Biomedical Text Mining
    Wang, Xiaosu
    Xiong, Yun
    Niu, Hao
    Yue, Jingwen
    Zhu, Yangyong
    Yu, Philip S.
    2021 21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2021), 2021, : 1415 - 1420
  • [23] Electric Power Audit Text Classification With Multi-Grained Pre-Trained Language Model
    Meng, Qinglin
    Song, Yan
    Mu, Jian
    Lv, Yuanxu
    Yang, Jiachen
    Xu, Liang
    Zhao, Jin
    Ma, Junwei
    Yao, Wei
    Wang, Rui
    Xiao, Maoxiang
    Meng, Qingyu
    IEEE ACCESS, 2023, 11 : 13510 - 13518
  • [24] Using Pre-trained Language Model to Enhance Active Learning for Sentence Matching
    Bai, Guirong
    He, Shizhu
    Liu, Kang
    Zhao, Jun
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2022, 21 (02)
  • [25] NMT Enhancement based on Knowledge Graph Mining with Pre-trained Language Model
    Yang, Hao
    Qin, Ying
    Deng, Yao
    Wang, Minghan
    2020 22ND INTERNATIONAL CONFERENCE ON ADVANCED COMMUNICATION TECHNOLOGY (ICACT): DIGITAL SECURITY GLOBAL AGENDA FOR SAFE SOCIETY!, 2020, : 185 - 189
  • [26] Pre-trained Language Model-based Retrieval and Ranking for Web Search
    Zou, Lixin
    Lu, Weixue
    Liu, Yiding
    Cai, Hengyi
    Chu, Xiaokai
    Ma, Dehong
    Shi, Daiting
    Sun, Yu
    Cheng, Zhicong
    Gu, Simiu
    Wang, Shuaiqiang
    Yin, Dawei
    ACM TRANSACTIONS ON THE WEB, 2023, 17 (01)
  • [27] Leveraging Pre-trained Language Model for Speech Sentiment Analysis
    Shon, Suwon
    Brusco, Pablo
    Pan, Jing
    Han, Kyu J.
    Watanabe, Shinji
    INTERSPEECH 2021, 2021, : 3420 - 3424
  • [28] Idiom Cloze Algorithm Integrating with Pre-trained Language Model
    Ju S.-G.
    Huang F.-Y.
    Sun J.-P.
    Ruan Jian Xue Bao/Journal of Software, 2022, 33 (10): 3793 - 3805
  • [29] Bringing legal knowledge to the public by constructing a legal question bank using large-scale pre-trained language model
    Yuan, Mingruo
    Kao, Ben
    Wu, Tien-Hsuan
    Cheung, Michael M. K.
    Chan, Henry W. H.
    Cheung, Anne S. Y.
    Chan, Felix W. H.
    Chen, Yongxi
    ARTIFICIAL INTELLIGENCE AND LAW, 2024, 32 (03) : 769 - 805
  • [30] A pre-trained language model for emergency department intervention prediction using routine physiological data and clinical narratives
    Huang, Ting-Yun
    Chong, Chee-Fah
    Lin, Heng-Yu
    Chen, Tzu-Ying
    Chang, Yung-Chun
    Lin, Ming-Chin
    INTERNATIONAL JOURNAL OF MEDICAL INFORMATICS, 2024, 191