Application of Entity-BERT model based on neuroscience and brain-like cognition in electronic medical record entity recognition

Cited: 3
Authors
Lu, Weijia [1 ,2 ]
Jiang, Jiehui [3 ]
Shi, Yaxiang [4 ]
Zhong, Xiaowei [5 ]
Gu, Jun [6 ]
Huangfu, Lixia [7 ]
Gong, Ming [7 ]
Affiliations
[1] Nantong Univ, Affiliated Hosp, Sci & Technol Dept, Nantong, Peoples R China
[2] Jianghai Hosp Nantong Sutong Sci & Technol Pk, Dept Internal Med, Nantong, Peoples R China
[3] Shanghai Univ, Dept Biomed Engn, Shanghai, Peoples R China
[4] Southeast Univ, Zhongda Hosp, Network Informat Ctr, Nanjing, Peoples R China
[5] China Univ Min & Technol, Sch Informat & Control Engn, Xuzhou, Peoples R China
[6] Nantong Univ, Affiliated Hosp, Dept Resp, Nantong, Peoples R China
[7] Nantong Univ, Informat Ctr Dept, Affiliated Hosp, Nantong, Peoples R China
Keywords
BERT; LSTM; cross attention; entity recognition; electronic medical records;
DOI
10.3389/fnins.2023.1259652
Chinese Library Classification (CLC): Q189 [Neuroscience]
Discipline code: 071006
Abstract
Introduction: In the medical field, electronic medical records (EMRs) contain large amounts of textual information, and the unstructured nature of this information makes data extraction and analysis challenging. Automatic extraction of entity information from EMRs has therefore become a significant problem in the healthcare domain.
Methods: To address this problem, this paper proposes a deep-learning-based entity extraction model called Entity-BERT. The model combines the feature-extraction capabilities of deep learning with the pre-trained language representations of BERT (Bidirectional Encoder Representations from Transformers), enabling it to automatically learn and recognize entity types in medical electronic records, including medical terminology, disease names, and drug information, and thereby provide more effective support for medical research and clinical practice. Entity-BERT uses a multi-layer neural network and a cross-attention mechanism to process and fuse information at different levels and of different types, resembling the hierarchical, distributed processing of the human brain. It employs pre-trained language and sequence models to process and learn from textual data, paralleling human language processing and semantic understanding, and it captures contextual information and long-range dependencies, using cross-attention to handle the complex and diverse language expressions found in EMRs. The paper also explores, from the perspective of neuroscience and brain-like cognition, how competitive learning, adaptive regulation, and synaptic plasticity could be used to optimize the model's predictions, automatically adjust its parameters, and achieve adaptive learning and dynamic adjustment.
Results and discussion: Experimental results demonstrate that Entity-BERT achieves outstanding performance on entity recognition in electronic medical records, surpassing existing entity recognition models. This research provides more efficient and accurate natural language processing technology for the medical and health field and introduces new ideas and directions for the design and optimization of deep learning models.
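The record does not include an implementation, but the cross-attention fusion the abstract describes (one feature stream attending over another before entity classification) can be sketched in miniature. The shapes, the two-stream setup, and the function names below are illustrative assumptions for exposition, not the authors' actual Entity-BERT architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values, d_k):
    """Scaled dot-product cross-attention: one stream (e.g. BERT token
    embeddings) attends over another (e.g. LSTM sequence features)."""
    scores = queries @ keys_values.T / np.sqrt(d_k)   # (Lq, Lkv) similarity
    weights = softmax(scores, axis=-1)                # each row sums to 1
    return weights @ keys_values                      # fused (Lq, d_k) output

rng = np.random.default_rng(0)
bert_feats = rng.standard_normal((6, 8))  # 6 tokens, 8-dim "BERT" features
lstm_feats = rng.standard_normal((6, 8))  # 6 tokens, 8-dim "LSTM" features
fused = cross_attention(bert_feats, lstm_feats, d_k=8)
print(fused.shape)  # (6, 8)
```

In a full NER model, the fused per-token representations would then feed a classification layer (or CRF) that assigns entity labels such as disease or drug names.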
Pages: 19