Medical QA Oriented Multi-Task Learning Model for Question Intent Classification and Named Entity Recognition

Times Cited: 3
Authors
Tohti, Turdi [1 ]
Abdurxit, Mamatjan [1 ]
Hamdulla, Askar [1 ]
Affiliations
[1] Xinjiang Univ, Sch Informat Sci & Engn, Xinjiang Key Lab Signal Detect & Proc, Urumqi 830017, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
multi-task learning; named entity recognition; intent classification; ALBERT; deep learning;
DOI
10.3390/info13120581
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Intent classification and named entity recognition of medical questions are two key subtasks of the natural language understanding module in a question answering system. Most existing methods treat intent classification and named entity recognition of medical queries as two separate tasks, ignoring the close relationship between them. To improve the performance of both tasks, a multi-task learning model based on ALBERT-BiLSTM is proposed for intent classification and named entity recognition of Chinese online medical questions. The multi-task learning model in this paper shares the encoder parameters, which enables the model's underlying network to capture features relevant to both named entity recognition and intent classification. The model learns the information shared between the two tasks while preserving task-specific characteristics in the decoding phase. The ALBERT pre-trained language model is used to obtain word vectors containing semantic information, and a bidirectional LSTM network is used for training. Comparative experiments with different models were conducted on a Chinese medical questions dataset. Experimental results show that the proposed multi-task learning method outperforms the benchmark methods in terms of precision, recall and F1 score. Compared with the single-task models, the generalization ability of the model is also improved.
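The shared-encoder architecture described in the abstract (an ALBERT encoder feeding two task-specific BiLSTM decoders, one for sentence-level intent classification and one for token-level named entity recognition) can be sketched as follows. This is a minimal illustrative sketch, not the authors' released code: the ALBERT configuration sizes, label counts, first-token pooling for intent classification, and the 1:1 loss weighting are assumptions made for the example.

```python
# Minimal sketch (assumptions noted above) of a shared ALBERT encoder with
# two BiLSTM task heads for intent classification and NER.
import torch
import torch.nn as nn
from transformers import AlbertConfig, AlbertModel


class AlbertBiLstmMultiTask(nn.Module):
    def __init__(self, num_intents=5, num_entity_tags=9, lstm_hidden=128):
        super().__init__()
        # Shared encoder: randomly initialized ALBERT here for a self-contained
        # example; in practice pretrained Chinese ALBERT weights would be loaded.
        config = AlbertConfig(hidden_size=312, num_hidden_layers=4,
                              num_attention_heads=12, intermediate_size=1248)
        self.encoder = AlbertModel(config)
        # Task-specific BiLSTM decoders on top of the shared representation.
        self.intent_lstm = nn.LSTM(config.hidden_size, lstm_hidden,
                                   batch_first=True, bidirectional=True)
        self.ner_lstm = nn.LSTM(config.hidden_size, lstm_hidden,
                                batch_first=True, bidirectional=True)
        self.intent_head = nn.Linear(2 * lstm_hidden, num_intents)
        self.ner_head = nn.Linear(2 * lstm_hidden, num_entity_tags)

    def forward(self, input_ids, attention_mask):
        # Shared contextual word vectors from ALBERT.
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        # Intent: classify the whole question from the first-token representation
        # (an assumed pooling choice for this sketch).
        intent_seq, _ = self.intent_lstm(hidden)
        intent_logits = self.intent_head(intent_seq[:, 0, :])
        # NER: per-token tag scores from a separate BiLSTM.
        ner_seq, _ = self.ner_lstm(hidden)
        ner_logits = self.ner_head(ner_seq)
        return intent_logits, ner_logits


# Joint training step: both task losses are summed so gradients from the two
# tasks update the shared encoder (illustrative 1:1 weighting, toy data).
model = AlbertBiLstmMultiTask()
input_ids = torch.randint(0, 1000, (2, 16))
attention_mask = torch.ones(2, 16, dtype=torch.long)
intent_logits, ner_logits = model(input_ids, attention_mask)
intent_loss = nn.CrossEntropyLoss()(intent_logits, torch.tensor([1, 3]))
ner_loss = nn.CrossEntropyLoss()(ner_logits.reshape(-1, 9),
                                 torch.randint(0, 9, (2 * 16,)))
loss = intent_loss + ner_loss
loss.backward()
```

In an actual setup, the encoder would be initialized from pretrained Chinese ALBERT weights (for example via AlbertModel.from_pretrained) rather than from a fresh configuration as in this sketch.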
Pages: 13
Related Papers
50 records in total
  • [1] Unified Transformer Multi-Task Learning for Intent Classification With Entity Recognition
    Benayas Alamos, Alberto Jose
    Hashempou, Reyhaneh
    Rumble, Damian
    Jameel, Shoaib
    De Amorim, Renato Cordeiro
    IEEE ACCESS, 2021, 9 : 147306 - 147314
  • [2] Chinese Named Entity Recognition Model Based on Multi-Task Learning
    Fang, Qin
    Li, Yane
    Feng, Hailin
    Ruan, Yaoping
    APPLIED SCIENCES-BASEL, 2023, 13 (08):
  • [3] A Neural Multi-Task Learning Framework to Jointly Model Medical Named Entity Recognition and Normalization
    Zhao, Sendong
    Liu, Ting
    Zhao, Sicheng
    Wang, Fei
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 817 - 824
  • [4] An attention-based multi-task model for named entity recognition and intent analysis of Chinese online medical questions
    Wu, Chaochen
    Luo, Guan
    Guo, Chao
    Ren, Yin
    Zheng, Anni
    Yang, Cheng
    JOURNAL OF BIOMEDICAL INFORMATICS, 2020, 108 (108)
  • [5] Leveraging Multi-task Learning for Biomedical Named Entity Recognition
    Mehmood, Tahir
    Gerevini, Alfonso
    Lavelli, Alberto
    Serina, Ivan
    ADVANCES IN ARTIFICIAL INTELLIGENCE, AI*IA 2019, 2019, 11946 : 431 - 444
  • [6] Biomedical Named Entity Recognition Based on Multi-task Learning
    Zhao, Hui
    Zhao, Di
    Meng, Jiana
    Su, Wen
    Mu, Wenxuan
    HEALTH INFORMATION PROCESSING, CHIP 2023, 2023, 1993 : 51 - 65
  • [7] MTAAL: Multi-Task Adversarial Active Learning for Medical Named Entity Recognition and Normalization
    Zhou, Baohang
    Cai, Xiangrui
    Zhang, Ying
    Guo, Wenya
    Yuan, Xiaojie
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14586 - 14593
  • [8] Adversarial Multi-task Learning for Efficient Chinese Named Entity Recognition
    Yan, Yibo
    Zhu, Peng
    Cheng, Dawei
    Yang, Fangzhou
    Luo, Yifeng
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2023, 22 (07)
  • [9] Multi-Task Learning for Chemical Named Entity Recognition with Chemical Compound Paraphrasing
    Watanabe, Taiki
    Tamura, Akihiro
    Ninomiya, Takashi
    Makino, Takuya
    Iwakura, Tomoya
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 6244 - 6249
  • [10] Research on Named Entity Recognition Based on Multi-Task Learning and Biaffine Mechanism
    Gao, Wenchao
    Li, Yu
    Guan, Xiaole
    Chen, Shiyu
    Zhao, Shanshan
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022