MIFNet: multimodal interactive fusion network for medication recommendation

Cited by: 1
Authors
Huo, Jiazhen [1 ]
Hong, Zhikai [1 ]
Chen, Mingzhou [1 ]
Duan, Yongrui [1 ]
Affiliations
[1] Tongji Univ, Sch Econ & Management, Shanghai, Peoples R China
Source
JOURNAL OF SUPERCOMPUTING | 2024, Vol. 80, Issue 9
Funding
National Natural Science Foundation of China
Keywords
Electronic health records; Multimodal fusion; Medication recommendation; Temporal event modeling;
DOI
10.1007/s11227-024-05908-1
CLC Classification Number
TP3 [Computing technology, computer technology]
Discipline Classification Code
0812
Abstract
Medication recommendation aims to provide clinicians with safe medicine combinations for the treatment of patients. Existing medication recommendation models are built on the temporal structured code data of electronic health records (EHRs); however, unstructured data in EHRs, such as free-text notes rich in clinical information, remain underexploited. To fill this gap, a novel multimodal interactive fusion network (MIFNet) is proposed for medication recommendation, integrating both structured code information and unstructured text information in EHRs. The model first extracts a series of informative feature representations, from medical codes, clinical notes, and a drug-drug interaction (DDI) knowledge graph, to encode the patient's health history and control potential DDIs. Next, a novel cross-modal interaction extraction block captures the intricate interactions between the two modalities. Finally, a multimodal fusion block fuses the constructed features and generates a medication combination list. Experiments on the public MIMIC-III dataset demonstrate that the proposed model outperforms state-of-the-art medication recommendation methods on the main evaluation metrics.
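To make the architecture described in the abstract more concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of a cross-modal interaction and fusion step in PyTorch: structured-code embeddings and clinical-note embeddings attend to each other, the two interaction-aware views are fused through a learned gate, and a sigmoid classifier scores each candidate medication for multi-label recommendation. All module names, dimensions, pooling choices, and the omission of the DDI knowledge-graph component are assumptions made only for illustration.

# Minimal sketch of cross-modal interaction + gated fusion for medication
# recommendation. This is NOT the MIFNet code; it only illustrates the idea
# of letting two EHR modalities (codes and notes) interact before fusion.
import torch
import torch.nn as nn


class CrossModalFusion(nn.Module):
    def __init__(self, d_model=64, n_heads=4, n_medications=131):
        super().__init__()
        # Cross-attention in both directions captures code<->text interactions.
        self.code_to_text = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.text_to_code = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Gated fusion of the two interaction-aware patient representations.
        self.gate = nn.Linear(2 * d_model, d_model)
        self.classifier = nn.Linear(d_model, n_medications)

    def forward(self, code_emb, text_emb):
        # code_emb: (batch, n_codes, d_model)  structured visit-code embeddings
        # text_emb: (batch, n_tokens, d_model) clinical-note token embeddings
        code_ctx, _ = self.code_to_text(code_emb, text_emb, text_emb)  # codes attend to text
        text_ctx, _ = self.text_to_code(text_emb, code_emb, code_emb)  # text attends to codes
        code_vec = code_ctx.mean(dim=1)   # pool each modality into one patient vector
        text_vec = text_ctx.mean(dim=1)
        gate = torch.sigmoid(self.gate(torch.cat([code_vec, text_vec], dim=-1)))
        fused = gate * code_vec + (1 - gate) * text_vec
        return torch.sigmoid(self.classifier(fused))  # probability per candidate medication


if __name__ == "__main__":
    model = CrossModalFusion()
    codes = torch.randn(2, 10, 64)    # toy structured-code embeddings
    notes = torch.randn(2, 50, 64)    # toy clinical-note embeddings
    print(model(codes, notes).shape)  # torch.Size([2, 131])

In this sketch the gate decides, per patient, how much the final representation relies on the code view versus the text view; the actual MIFNet fusion and its DDI control may differ.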
Pages: 12313-12345
Number of Pages: 33
Related Papers
50 records in total
  • [1] Han, Teng-Yue; Wang, Peng-Fei; Niu, Shao-Zhang. Multimodal Interactive Network for Sequential Recommendation. JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2023, 38 (04): 911-926
  • [2] Chen, Yanke; Sun, Tianhao; Ma, Yunhao; Zou, Huhai. Multifactorial modality fusion network for multimodal recommendation. APPLIED INTELLIGENCE, 2025, 55 (02)
  • [3] An, Yang; Zhang, Liang; You, Mao; Tian, Xueqing; Jin, Bo; Wei, Xiaopeng. MeSIN: Multilevel selective and interactive network for medication recommendation. KNOWLEDGE-BASED SYSTEMS, 2021, 233
  • [4] Cheng, Jieren; Peng, Xin; Tang, Xiangyan; Tu, Wenxuan; Xu, Wenhang. MIFNet: A lightweight multiscale information fusion network. INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2022, 37 (09): 5617-5642
  • [5] Zahalka, Jan; Rudinac, Stevan; Worring, Marcel. Interactive Multimodal Learning for Venue Recommendation. IEEE TRANSACTIONS ON MULTIMEDIA, 2015, 17 (12): 2235-2244
  • [6] Zhou, Yan; Guo, Jie; Sun, Hao; Song, Bin; Yu, Fei Richard. Attention-guided Multi-step Fusion: A Hierarchical Fusion Network for Multimodal Recommendation. PROCEEDINGS OF THE 46TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2023, 2023: 1816-1820
  • [7] Pan, Zhihong; Dou, Hao; Mao, Jiaxing; Dai, Min; Tian, Jinwen. MIFNet: Multi-Information Fusion Network for Sea-Land Segmentation. ICAIP 2018: 2018 THE 2ND INTERNATIONAL CONFERENCE ON ADVANCES IN IMAGE PROCESSING, 2018: 24-29
  • [8] Wang, Jun; Wang, Qianlong; Wen, Zhiyuan; Liang, Xingwei; Xu, Ruifeng. Interactive Fusion Network with Recurrent Attention for Multimodal Aspect-based Sentiment Analysis. ARTIFICIAL INTELLIGENCE, CICAI 2022, PT III, 2022, 13606: 298-309
  • [9] Yang, Yuxin; Yu, Hong; Zhang, Xin; Zhang, Peng; Tu, Wan; Gu, Lishuai. Fish behavior recognition based on an audio-visual multimodal interactive fusion network. AQUACULTURAL ENGINEERING, 2024, 107