Medication Recommendation Based on a Knowledge-enhanced Pre-training Model

Cited by: 3
Authors
Wang, Mengzhen [1 ]
Chen, Jianhui [2 ]
Lin, Shaofu [1 ]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Coll Software Engn, Beijing, Peoples R China
[2] Beijing Univ Technol, Beijing Int Collaborat Base Brain Informat & Wisd, Beijing, Peoples R China
Source
PROCEEDINGS OF 2021 IEEE/WIC/ACM INTERNATIONAL CONFERENCE ON WEB INTELLIGENCE AND INTELLIGENT AGENT TECHNOLOGY WORKSHOPS AND SPECIAL SESSIONS (WI-IAT WORKSHOP/SPECIAL SESSION 2021) | 2021
Keywords
Electronic medical record; Graph Attention Network; Pre-training model; Medication recommendation; Neural network
DOI
10.1145/3498851.3498968
CLC classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Electronic medical record (EMR)-based auxiliary diagnosis and treatment has attracted increasing attention, and medication recommendation is an important research direction within it. Existing medication recommendation models mainly depend on patient, diagnosis, and medication data; however, the limited amount of clinical data with temporal dependencies remains a major obstacle. This paper proposes a new knowledge-enhanced pre-training model for medication recommendation. On the one hand, the classification knowledge embedded in diagnostic and drug codes is encoded by a Graph Attention Network and fused into the clinical data, expanding the data content. On the other hand, a large volume of single-visit EMR data is used to pre-train a visit model with a modified BERT, expanding the data scale. Experimental results on EMR data from more than 2,000 medical and health institutions in Hainan, China show that fusing classification knowledge with the pre-training model effectively improves the accuracy of medication recommendation.
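The record contains no implementation details, so the following is only a minimal, self-contained sketch of the two ideas the abstract names: encoding the code-classification hierarchy with graph attention, and fusing the resulting knowledge embeddings with a pre-trained visit encoder before predicting drugs. All class names, dimensions, the mean-pooling stand-in for the BERT-style visit encoder, and the additive fusion step are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """Single-head, GAT-style attention over a code-classification graph."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, in_dim) code embeddings; adj: (N, N) adjacency of the
        # ICD/ATC hierarchy (parent-child edges plus self-loops).
        h = self.W(x)                                        # (N, out_dim)
        n = h.size(0)
        h_i = h.unsqueeze(1).expand(n, n, -1)
        h_j = h.unsqueeze(0).expand(n, n, -1)
        e = F.leaky_relu(self.a(torch.cat([h_i, h_j], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float("-inf"))           # neighbours only
        alpha = torch.softmax(e, dim=-1)                     # attention weights
        return F.elu(alpha @ h)                              # knowledge-enhanced embeddings


class MeanVisitEncoder(nn.Module):
    """Toy stand-in for the paper's pre-trained BERT-style visit encoder."""

    def __init__(self, num_codes: int, dim: int):
        super().__init__()
        self.embed = nn.Embedding(num_codes, dim)

    def forward(self, visit_codes: torch.Tensor) -> torch.Tensor:
        return self.embed(visit_codes).mean(dim=1)           # (batch, dim)


class KnowledgeEnhancedRecommender(nn.Module):
    """Fuses ontology-level code knowledge with visit representations."""

    def __init__(self, visit_encoder: nn.Module, num_codes: int,
                 code_dim: int = 64, num_drugs: int = 150):
        super().__init__()
        self.visit_encoder = visit_encoder
        self.code_embed = nn.Embedding(num_codes, code_dim)
        self.gat = GraphAttentionLayer(code_dim, code_dim)
        self.classifier = nn.Linear(code_dim, num_drugs)     # multi-label drug scores

    def forward(self, visit_codes: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # visit_codes: (batch, seq_len) indices of the diagnosis codes in one visit.
        knowledge = self.gat(self.code_embed.weight, adj)    # (num_codes, code_dim)
        code_repr = knowledge[visit_codes].mean(dim=1)       # pool the visit's codes
        visit_repr = self.visit_encoder(visit_codes)         # (batch, code_dim)
        fused = visit_repr + code_repr                       # simple additive fusion
        return torch.sigmoid(self.classifier(fused))         # per-drug probabilities


# Toy usage: 10 codes, self-loop-only hierarchy, a batch of two 5-code visits.
num_codes, dim = 10, 64
adj = torch.eye(num_codes)
model = KnowledgeEnhancedRecommender(MeanVisitEncoder(num_codes, dim), num_codes, dim)
scores = model(torch.randint(0, num_codes, (2, 5)), adj)     # shape (2, 150)
```

In the paper itself, the GAT-encoded diagnosis and drug hierarchies are presumably fused into the clinical input of the modified BERT rather than added after encoding as above; the sketch only illustrates the overall data flow from code ontology and visit records to multi-label drug probabilities.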
Pages: 290-294
Page count: 5
Related papers
50 records in total
  • [41] One-Shot Simple Pattern Detection without Pre-Training and Gradient-Based Strategy
    Su, Jun
    He, Wei
    Wang, Yingguan
    Ma, Runze
    SENSORS, 2023, 23 (22)
  • [42] Smart contract vulnerability detection method based on pre-training and novel timing graph neural network
    Zhuang, Yuan
    Fan, Zekai
    Wang, Cheng
    Sun, Jianguo
    Li, Yaolin
Tongxin Xuebao/Journal on Communications, 2024, 45 (09) : 101 - 114
  • [43] Remote Sensing Image Vehicle Detection Based on Pre-Training and Random-Initialized Fusion Network
    Liu, Hongkun
    Ding, Qichen
    Hu, Zican
    Chen, Xueyun
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2022, 19
  • [44] A fast and efficient pre-training method based on layer-by-layer maximum discrimination for deep neural networks
    Seyyedsalehi, Seyyede Zohreh
    Seyyedsalehi, Seyyed Ali
    NEUROCOMPUTING, 2015, 168 : 669 - 680
  • [45] Pre-training Techniques for Improving Text-to-Speech Synthesis by Automatic Speech Recognition Based Data Enhancement
    Liu, Yazhu
    Xue, Shaofei
    Tang, Jian
    MAN-MACHINE SPEECH COMMUNICATION, NCMMSC 2022, 2023, 1765 : 162 - 172
  • [46] Evaluation model of classroom teaching quality based on improved RVM algorithm and knowledge recommendation
    Sun Qianna
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2021, 40 (02) : 2457 - 2467
  • [47] Chinese mineral exploration named entity recognition for literature mining by fusing multi-features with an enhancement domain pre-training model
    Wu, Qirui
    Liu, Zhihao
    Miao, Tian
    Qiu, Qinjun
    Tao, Liufeng
    Chen, Jianguo
    Xie, Zhong
    ORE GEOLOGY REVIEWS, 2025, 176
  • [48] SIMPLEFLAT: A SIMPLE WHOLE-NETWORK PRE-TRAINING APPROACH FOR RNN TRANSDUCER-BASED END-TO-END SPEECH RECOGNITION
    Moriya, Takafumi
    Ashihara, Takanori
    Tanaka, Tomohiro
    Ochiai, Tsubasa
    Sato, Hiroshi
    Ando, Atsushi
    Ijima, Yusuke
    Masumura, Ryo
    Shinohara, Yusuke
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 5664 - 5668
  • [49] A two-phase knowledge distillation model for graph convolutional network-based recommendation
    Huang, Zhenhua
    Lin, Zuorui
    Gong, Zheng
    Chen, Yunwen
    Tang, Yong
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2022, 37 (09) : 5902 - 5923
  • [50] Clinical Coding Based on Knowledge Enhanced Language Model and Attention Pooling
    He, Yong
    Li, Weiqing
    Zhang, Shun
    Li, Zhaorong
    Ding, Zixiao
    Zeng, Zhenyu
    HEALTH INFORMATION PROCESSING. EVALUATION TRACK PAPERS, 2023, 1773 : 185 - 205