Medication Recommendation Based on a Knowledge-enhanced Pre-training Model

Cited by: 3
Authors
Wang, Mengzhen [1]
Chen, Jianhui [2]
Lin, Shaofu [1]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Coll Software Engn, Beijing, Peoples R China
[2] Beijing Univ Technol, Beijing Int Collaborat Base Brain Informat & Wisd, Beijing, Peoples R China
Source
PROCEEDINGS OF 2021 IEEE/WIC/ACM INTERNATIONAL CONFERENCE ON WEB INTELLIGENCE AND INTELLIGENT AGENT TECHNOLOGY WORKSHOPS AND SPECIAL SESSIONS: (WI-IAT WORKSHOP/SPECIAL SESSION 2021) | 2021
Keywords
Electronic medical record; Graph Attention Network; Pre-training model; Medication recommendation; Neural network
DOI
10.1145/3498851.3498968
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Electronic medical record (EMR)-based auxiliary diagnosis and treatment has received growing attention, and medication recommendation is an important research direction within it. Existing medication recommendation models mainly depend on patient, diagnosis, and medication data; however, the insufficient amount of clinical data with temporal dependencies remains a major obstacle. This paper proposes a new knowledge-enhanced pre-training model for medication recommendation. On the one hand, the classification knowledge contained in diagnostic codes and drug codes is encoded by a Graph Attention Network and fused into the clinical data to expand the data content. On the other hand, a large amount of single-visit EMR data is used to build a pre-trained visit model with a modified BERT, expanding the data scale. Experimental results on EMR data from more than 2,000 medical and health institutions in Hainan, China show that fusing classification knowledge with the pre-training model effectively improves the accuracy of medication recommendation.
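The abstract describes two components: graph attention over the diagnosis/drug classification hierarchy to enrich code embeddings, and a modified BERT pre-trained on single-visit records that feeds a medication predictor. The sketch below is an illustrative PyTorch reconstruction of that pipeline, not the authors' code: the toy ontology, all dimensions, and the CodeGraphAttention/VisitEncoder names are hypothetical, and a small Transformer encoder stands in for the modified BERT visit model.

# Illustrative sketch (not the authors' implementation): graph attention over a
# code-classification hierarchy enriches diagnosis/drug code embeddings, and a
# Transformer encoder (stand-in for the modified BERT) scores medications per visit.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CodeGraphAttention(nn.Module):
    """Single-head graph attention over ontology edges (code -> related/parent code)."""
    def __init__(self, num_codes, dim):
        super().__init__()
        self.emb = nn.Embedding(num_codes, dim)
        self.attn = nn.Linear(2 * dim, 1)

    def forward(self, edge_index):
        # edge_index: LongTensor [2, E]; row 0 = target code, row 1 = neighbor code
        h = self.emb.weight                                # [num_codes, dim]
        tgt, nbr = edge_index
        e = F.leaky_relu(self.attn(torch.cat([h[tgt], h[nbr]], dim=-1))).squeeze(-1)
        alpha = torch.zeros_like(e)                        # attention normalized per target
        for c in tgt.unique():
            mask = tgt == c
            alpha[mask] = F.softmax(e[mask], dim=0)
        out = torch.zeros_like(h)
        out.index_add_(0, tgt, alpha.unsqueeze(-1) * h[nbr])
        return h + out                                     # knowledge-enhanced code embeddings

class VisitEncoder(nn.Module):
    """Transformer encoder over the codes of one visit, with a multi-label drug head."""
    def __init__(self, dim, num_meds):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.med_head = nn.Linear(dim, num_meds)

    def forward(self, code_vectors):                       # [batch, seq, dim]
        visit = self.encoder(code_vectors).mean(dim=1)     # pooled visit representation
        return self.med_head(visit)                        # one logit per candidate drug

# Toy usage with made-up sizes: 100 codes, 16-dim embeddings, 20 candidate drugs.
gat = CodeGraphAttention(num_codes=100, dim=16)
enc = VisitEncoder(dim=16, num_meds=20)
edges = torch.tensor([[2, 3, 3], [1, 1, 2]])               # hypothetical child -> parent links
code_emb = gat(edges)
visit_codes = torch.tensor([[2, 3, 7, 9]])                 # codes recorded in one visit
probs = torch.sigmoid(enc(code_emb[visit_codes]))          # per-medication probabilities

The sigmoid multi-label head reflects that a single visit can involve several prescribed drugs, and the attention over ontology edges lets rarely observed codes borrow information from their parent categories, which is one plausible reading of the "expanding the data content" step in the abstract.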
Pages: 290-294
Number of pages: 5