Medical named entity recognition of Chinese electronic medical records based on stacked Bidirectional Long Short-Term Memory

Cited by: 3
Authors
Zhu, Zhichao [1 ]
Li, Jianqiang [1 ]
Zhao, Qing [1 ]
Wei, Yu-Chih [2 ]
Jia, Yanhe [3 ]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
[2] Natl Taipei Univ Technol, Taipei, Taiwan
[3] Beijing Informat Sci & Technol Univ, Sch Econ & Management, Beijing, Peoples R China
Source
2021 IEEE 45TH ANNUAL COMPUTERS, SOFTWARE, AND APPLICATIONS CONFERENCE (COMPSAC 2021) | 2021
Keywords
Electronic medical record (EMR); Named entity recognition (NER); Bidirectional Encoder Representation from Transformers (BERT); Stacked Bidirectional Long Short-Term Memory (BILSTM);
DOI
10.1109/COMPSAC51774.2021.00293
Chinese Library Classification
TP39 [Computer Applications];
Discipline Classification Code
081203; 0835;
Abstract
The wide adoption of electronic medical record (EMR) systems has caused rapid growth of medical and clinical data, making medical named entity recognition (NER) technologies critical for finding useful patient information in medical datasets. However, medical terminology is inherently complex and ambiguous, and it is difficult to capture context-dependent representations from the supervision signal of a simple single-layer model. To address this problem, this paper proposes a hybrid model based on stacked Bidirectional Long Short-Term Memory (BILSTM) for medical named entity recognition, which we call BSBC (BERT combined with stacked BILSTM and CRF). First, we use Bidirectional Encoder Representation from Transformers (BERT) to perform unsupervised learning on an unlabeled dataset to obtain character-level embeddings. Then, a stacked BILSTM captures context-dependent representations through its multiple hidden layers. Finally, a Conditional Random Field (CRF) predicts the sequence tags. The experimental results show that our method significantly outperforms the baseline methods and serves as a strong alternative to traditional approaches.
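For illustration only, the sketch below outlines the three-stage pipeline described in the abstract (character embeddings, stacked BILSTM, sequence tagging) in PyTorch. It is not the authors' implementation: the layer sizes and tag count are hypothetical, a plain nn.Embedding stands in for the BERT character-level embeddings, and the CRF decoder is replaced by per-character emission scores so the snippet stays self-contained and runnable.

```python
# Minimal sketch of a BSBC-style tagger (hypothetical sizes; not the paper's code).
import torch
import torch.nn as nn

class StackedBiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256,
                 num_layers=2, num_tags=9):
        super().__init__()
        # Stand-in for BERT character-level embeddings (the paper uses BERT here).
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Stacked BILSTM: num_layers > 1 gives the multi-hidden-layer structure.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, num_layers=num_layers,
                              batch_first=True, bidirectional=True)
        # Per-character emission scores; in the paper a CRF layer scores
        # whole tag sequences on top of these emissions.
        self.emissions = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, char_ids):
        x = self.embed(char_ids)        # (batch, seq_len, embed_dim)
        context, _ = self.bilstm(x)     # (batch, seq_len, 2 * hidden_dim)
        return self.emissions(context)  # (batch, seq_len, num_tags)

if __name__ == "__main__":
    model = StackedBiLSTMTagger(vocab_size=5000)
    chars = torch.randint(0, 5000, (2, 30))  # two sentences of 30 characters
    print(model(chars).shape)                # torch.Size([2, 30, 9])
```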
Pages: 1930-1935
Page count: 6