An Attentive Multi-Modal CNN for Brain Tumor Radiogenomic Classification

Cited by: 11
Authors
Qu, Ruyi [1 ]
Xiao, Zhifeng [2 ]
Affiliations
[1] Univ Toronto, Dept Math, Toronto, ON M5S 2E4, Canada
[2] Penn State Erie, Behrend Coll, Sch Engn, Erie, PA 16563 USA
Keywords
multi-modal medical image; image classification; brain tumor; MGMT methylation status
DOI
10.3390/info13030124
CLC number
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
Medical images of brain tumors are critical for characterizing tumor pathology and enabling early diagnosis. Brain tumor magnetic resonance imaging (MRI) scans come in multiple modalities, and fusing the distinctive features of each modality can accurately determine the nature of a tumor. The current genetic-analysis approach is time-consuming and requires surgical extraction of brain tissue samples, so accurate classification of multi-modal brain tumor images can speed up detection and reduce patient suffering. Medical image fusion refers to merging the significant information from multiple source images of the same tissue into a single image that carries richer diagnostic information. This paper proposes a novel attentive deep-learning-based classification model that integrates multi-modal feature aggregation, a lite attention mechanism, separable embedding, and modal-wise shortcuts for performance improvement. We evaluate the model on the RSNA-MICCAI dataset, a scenario-specific medical image dataset, and demonstrate that the proposed method outperforms the state of the art (SOTA) by around 3%.
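The abstract's core idea, attention-weighted aggregation of per-modality features combined with a modal-wise shortcut, can be sketched in plain NumPy. This is a minimal illustration under stated assumptions, not the paper's implementation: the function and variable names are hypothetical, the per-modality feature vectors stand in for learned CNN embeddings, and a single learned score vector stands in for the paper's lite attention module.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array of logits.
    e = np.exp(x - x.max())
    return e / e.sum()

def fuse_modalities(features, score_w):
    """Attention-weighted fusion of per-modality feature vectors.

    features: (M, D) array, one D-dim embedding per MRI modality
              (e.g. FLAIR, T1w, T1wCE, T2w -> M = 4).
    score_w:  (D,) hypothetical learned vector producing attention logits.
    """
    scores = features @ score_w          # (M,) one attention logit per modality
    weights = softmax(scores)            # (M,) weights summing to 1
    fused = weights @ features           # (D,) attention-weighted aggregation
    shortcut = features.mean(axis=0)     # (D,) modal-wise shortcut path
    return fused + shortcut, weights

# Toy usage: 4 modalities, 8-dim embeddings.
rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 8))
w = rng.standard_normal(8)
fused, attn = fuse_modalities(feats, w)
```

The shortcut term mirrors the residual idea: even if the attention weights collapse onto one modality, every modality still contributes to the fused representation.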
Pages: 14
Related papers (50 total)
  • [21] MOAB: MULTI-MODAL OUTER ARITHMETIC BLOCK FOR FUSION OF HISTOPATHOLOGICAL IMAGES AND GENETIC DATA FOR BRAIN TUMOR GRADING
    Alwazzan, Omnia
    Khan, Abbas
    Patras, Ioannis
    Slabaugh, Gregory
    2023 IEEE 20TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING, ISBI, 2023,
  • [22] An Efficient Brain tumor classification using CNN and transfer learning
    Sasikumar, P.
    Cherukuvada, Srikanth
    Balmurugan, P.
    Anand, Vijay P.
    Brindasri, S.
    Nareshkumar, R.
    2024 INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTING, COMMUNICATION AND APPLIED INFORMATICS, ACCAI 2024, 2024,
  • [23] Effect of Mixup Enhancement on CNN network for Brain Tumor Classification
    Du, Yifei
    Chen, Zipei
    Toe, Teoh Teik
    2022 ASIA CONFERENCE ON ALGORITHMS, COMPUTING AND MACHINE LEARNING (CACML 2022), 2022, : 356 - 360
  • [24] Alzheimer's disease classification method based on multi-modal medical images
    Han K.
    Pan H.
    Zhang W.
    Bian X.
    Chen C.
    He S.
    Qinghua Daxue Xuebao/Journal of Tsinghua University, 2020, 60 (08): : 664 - 671, 682
  • [25] COVID-19 Hierarchical Classification Using a Deep Learning Multi-Modal
    Althenayan, Albatoul S.
    Alsalamah, Shada A.
    Aly, Sherin
    Nouh, Thamer
    Mahboub, Bassam
    Salameh, Laila
    Alkubeyyer, Metab
    Mirza, Abdulrahman
    SENSORS, 2024, 24 (08)
  • [26] PolSAR Image Classification Based on Multi-Modal Contrastive Fully Convolutional Network
    Hua, Wenqiang
    Wang, Yi
    Yang, Sijia
    Jin, Xiaomin
    REMOTE SENSING, 2024, 16 (02)
  • [27] BRAIN TUMOR MRI MEDICAL IMAGES CLASSIFICATION MODEL BASED ON CNN (BTMIC-CNN)
    Al-Galal, Sabaa Ahmed Yahya
    Alshaikhli, Imad Fakhri Taha
    Abdulrazzaq, M. M.
    Hassan, Raini
    JOURNAL OF ENGINEERING SCIENCE AND TECHNOLOGY, 2022, 17 (06): : 4410 - 4432
  • [28] Brain tumor segmentation and classification using hybrid deep CNN with LuNetClassifier
    Balamurugan, T.
    Gnanamanoharan, E.
    Neural Computing and Applications, 2023, 35 : 4739 - 4753
  • [29] Detection and Classification of Brain Tumor Using Convolutional Neural Network (CNN)
    Deshmukh, Smita
    Tiwari, Divya
    MACHINE LEARNING AND BIG DATA ANALYTICS (PROCEEDINGS OF INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND BIG DATA ANALYTICS (ICMLBDA) 2021), 2022, 256 : 289 - 303
  • [30] Brain tumor segmentation and classification using hybrid deep CNN with LuNetClassifier
    Balamurugan, T.
    Gnanamanoharan, E.
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (06) : 4739 - 4753