Medical Entity Recognition Based on BiLSTM with Knowledge Graph and Attention Mechanism

Cited by: 4
Authors
Wang, Qiaoling [1]
Liu, Yu [1]
Gu, Jinguang [1]
Fu, Haidong [1]
Affiliations
[1] Wuhan University of Science and Technology, Department of Computer Science and Technology, Wuhan, People's Republic of China
Source
2021 4th International Conference on Intelligent Autonomous Systems (ICoIAS 2021), 2021
Funding
National Natural Science Foundation of China
Keywords
Internet medical consultation text; named entity recognition; deep neural network; attention mechanism; knowledge graph;
DOI
10.1109/ICoIAS53694.2021.00035
Chinese Library Classification (CLC)
TP [Automation and computer technology]
Discipline code
0812
Abstract
Given the non-standard expressions typical of Internet text, some information extraction models have employed domain-specific knowledge graphs to improve entity recognition performance. However, these existing models use background knowledge in only a single step, leaving additional clues in the knowledge graph underexploited. To address this problem, this paper proposes a new entity recognition model based on BiLSTM with a knowledge graph and an attention mechanism, and applies it to extract medical entities from Internet medical consultation text. The model extracts conceptual features and candidate knowledge sets from a medical knowledge graph. The encoded concept features are embedded into the input of the BiLSTM to enrich the semantic representation of words, while the candidate knowledge sets are integrated into the attention mechanism to capture the important information in both the knowledge graph and the context. To verify the effectiveness of the model, we extracted medical named entities from consultation text from Haodafu. Experimental results show that the proposed model effectively improves entity recognition performance.
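
The record gives only the abstract, so the following is a minimal, hypothetical PyTorch sketch of the architecture it describes, not the authors' implementation: per-token KG concept embeddings are concatenated to word embeddings at the BiLSTM input, and a dot-product attention over each token's candidate KG entries produces a knowledge context vector that is fed to the tag classifier. All layer sizes, names, and the attention form are illustrative assumptions.

# Hypothetical sketch (not the paper's code): BiLSTM tagger with
# KG concept features at the input and attention over candidate
# KG entries before classification.
import torch
import torch.nn as nn

class KGBiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, concept_size, num_tags,
                 word_dim=100, concept_dim=50, kg_dim=50, hidden_dim=128):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # One concept id per token, looked up from the medical KG (assumed).
        self.concept_emb = nn.Embedding(concept_size, concept_dim)
        self.bilstm = nn.LSTM(word_dim + concept_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Project BiLSTM states into the KG space to score candidates.
        self.query_proj = nn.Linear(2 * hidden_dim, kg_dim)
        self.classifier = nn.Linear(2 * hidden_dim + kg_dim, num_tags)

    def forward(self, words, concepts, cand_kg):
        # words, concepts: (batch, seq); cand_kg: (batch, seq, k, kg_dim),
        # i.e. k pre-encoded candidate KG entries per token (assumed input).
        x = torch.cat([self.word_emb(words), self.concept_emb(concepts)], dim=-1)
        h, _ = self.bilstm(x)                     # (batch, seq, 2*hidden)
        q = self.query_proj(h).unsqueeze(2)       # (batch, seq, 1, kg_dim)
        scores = (q * cand_kg).sum(-1)            # dot-product attention scores
        attn = torch.softmax(scores, dim=-1).unsqueeze(-1)
        kg_ctx = (attn * cand_kg).sum(2)          # (batch, seq, kg_dim)
        return self.classifier(torch.cat([h, kg_ctx], dim=-1))  # tag logits

# Toy usage: 2 sentences, 5 tokens each, 4 candidate KG entries per token.
model = KGBiLSTMTagger(vocab_size=1000, concept_size=20, num_tags=9)
words = torch.randint(0, 1000, (2, 5))
concepts = torch.randint(0, 20, (2, 5))
cand_kg = torch.randn(2, 5, 4, 50)
logits = model(words, concepts, cand_kg)          # shape (2, 5, 9)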
Pages: 149-157
Page count: 9