Entity recognition in Chinese clinical text using attention-based CNN-LSTM-CRF

Cited by: 30
Authors
Tang, Buzhou [1]
Wang, Xiaolong [1]
Yan, Jun [2]
Chen, Qingcai [1]
Affiliations
[1] Harbin Inst Technol, Key Lab Network Oriented Intelligent Computat, Shenzhen 518055, Peoples R China
[2] Yidu Cloud Beijing Technol Co Ltd, Beijing 100191, Peoples R China
Keywords
Chinese clinical entity recognition; Neural network; Convolutional neural network; Long short-term memory; Conditional random field
DOI: 10.1186/s12911-019-0787-y
Chinese Library Classification (CLC): R-058
Abstract
Background: Clinical entity recognition, a fundamental task in clinical text processing, has attracted a great deal of attention during the last decade. However, most studies focus on clinical text in English rather than other languages. Recently, a few researchers have begun to study entity recognition in Chinese clinical text.
Methods: In this paper, a novel deep neural network, called attention-based CNN-LSTM-CRF, is proposed to recognize entities in Chinese clinical text. Attention-based CNN-LSTM-CRF extends LSTM-CRF by introducing a CNN (convolutional neural network) layer after the input layer to capture local context information around words of interest, and an attention layer before the CRF (conditional random field) layer to select relevant words in the same sentence.
Results: To evaluate the proposed method, we compare it with two other currently popular methods, CRF and LSTM-CRF, on two benchmark datasets. One dataset is publicly available and contains only contiguous clinical entities; the other is constructed by us and contains both contiguous and discontiguous clinical entities. Experimental results show that attention-based CNN-LSTM-CRF outperforms CRF and LSTM-CRF.
Conclusions: The CNN layer and the attention mechanism are each individually beneficial to an LSTM-CRF-based Chinese clinical entity recognition system, whether only contiguous clinical entities are considered or discontiguous ones are included as well. The contribution of the attention mechanism is greater than that of the CNN layer.
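As a concrete illustration of the architecture described in the Methods paragraph, the following is a minimal PyTorch sketch of an attention-based CNN-LSTM tagger. The class name, layer sizes, and parameter names (AttentionCNNLSTMTagger, emb_dim, conv_dim, lstm_dim) are illustrative assumptions rather than the authors' implementation, and the final CRF layer that the paper places on top of the emission scores is only indicated in a comment.

```python
# Minimal sketch of the attention-based CNN-LSTM(-CRF) tagger described above.
# Hyperparameters and names are illustrative; the CRF layer is omitted (see comment).
import torch
import torch.nn as nn

class AttentionCNNLSTMTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=100, conv_dim=100,
                 lstm_dim=100, kernel_size=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # CNN layer after the input layer: captures local context around each word.
        self.conv = nn.Conv1d(emb_dim, conv_dim, kernel_size,
                              padding=kernel_size // 2)
        # Bidirectional LSTM over the locally contextualized sequence.
        self.lstm = nn.LSTM(conv_dim, lstm_dim, batch_first=True,
                            bidirectional=True)
        # Attention layer before the output/CRF layer: each position attends to
        # relevant words in the same sentence.
        self.attn = nn.MultiheadAttention(2 * lstm_dim, num_heads=1,
                                          batch_first=True)
        # Per-token emission scores; in the full model a CRF layer would decode
        # these into a globally consistent tag sequence.
        self.emit = nn.Linear(2 * lstm_dim, num_tags)

    def forward(self, token_ids):                  # (batch, seq_len)
        x = self.embed(token_ids)                  # (batch, seq_len, emb_dim)
        x = self.conv(x.transpose(1, 2)).transpose(1, 2).relu()
        x, _ = self.lstm(x)                        # (batch, seq_len, 2*lstm_dim)
        x, _ = self.attn(x, x, x)                  # sentence-level self-attention
        return self.emit(x)                        # (batch, seq_len, num_tags)

# Example: emission scores for a batch of 2 sentences of length 20.
scores = AttentionCNNLSTMTagger(vocab_size=5000, num_tags=9)(
    torch.randint(1, 5000, (2, 20)))
```

The key design point is the ordering of layers: the convolution enriches each token with local context before the bidirectional LSTM, and sentence-level self-attention is applied to the LSTM outputs before tag scoring (and, in the full model, CRF decoding).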
Pages: 9
Related papers (50 records in total)
  • [21] Chinese Named Entity Recognition in Power Domain Based on Bi-LSTM-CRF
    Zhao, Zhenqiang
    Chen, Zhenyu
    Liu, Jinbo
    Huang, Yunhao
    Gao, Xingyu
    Di, Fangchun
    Li, Lixin
    Ji, Xiaohui
    2019 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND PATTERN RECOGNITION (AIPR 2019), 2019, : 176 - 180
  • [22] Recognition of Ironic Sentences in Twitter using Attention-Based LSTM
    Martini, Andrianarisoa Tojo
    Farrukh, Makhmudov
    Ge, Hongwei
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2018, 9 (08) : 7 - 11
  • [23] Attention-based Text Recognition in the Wild
    Yan, Zhi-Chen
    Yu, Stephanie A.
    PROCEEDINGS OF THE 1ST INTERNATIONAL CONFERENCE ON DEEP LEARNING THEORY AND APPLICATIONS (DELTA), 2020, : 42 - 49
  • [24] Advancing human action recognition: A hybrid approach using attention-based LSTM and 3D CNN
    Saoudi, El Mehdi
    Jaafari, Jaafar
    Andaloussi, Said Jai
    SCIENTIFIC AFRICAN, 2023, 21
  • [25] Named Entity Recognition of Chinese Agricultural Text Based on Attention Mechanism
    Zhao, Pengfei
    Zhao, Chunjiang
    Wu, Huarui
    Wang, Wei
    Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 2021, 52 (01): : 185 - 192
  • [27] An attention-based deep learning model for clinical named entity recognition of Chinese electronic medical records
    Li, Luqi
    Zhao, Jie
    Hou, Li
    Zhai, Yunkai
    Shi, Jinming
    Cui, Fangfang
    BMC MEDICAL INFORMATICS AND DECISION MAKING, 2019, 19 (01)
  • [28] Post Text Processing of Chinese Speech Recognition Based on Bidirectional LSTM Networks and CRF
    Li Yang
    Li, Ying
    Wang, Jin
    Tang, Zhuo
    ELECTRONICS, 2019, 8 (11) : 1249
  • [29] Portuguese Named Entity Recognition Using LSTM-CRF
    Quinta de Castro, Pedro Vitor
    Felipe da Silva, Nadia Felix
    Soares, Anderson da Silva
    COMPUTATIONAL PROCESSING OF THE PORTUGUESE LANGUAGE, PROPOR 2018, 2018, 11122 : 83 - 92
  • [30] Attention-Based LSTM with Filter Mechanism for Entity Relation Classification
    Jin, Yanliang
    Wu, Dijia
    Guo, Weisi
    SYMMETRY-BASEL, 2020, 12 (10): : 1 - 16