RAE-Net: a multi-modal neural network based on feature fusion and evidential deep learning algorithm in predicting breast cancer subtypes on DCE-MRI

Cited: 0
Authors
Tang, Xiaowen [1 ]
Zhu, Yinsu [1 ]
Affiliations
[1] Nanjing Med Univ, Jiangsu Inst Canc Res, Dept Radiol, Jiangsu Canc Hosp, Affiliated Canc Hosp, 42 Baiziting, Nanjing 210009, Jiangsu, Peoples R China
Source
BIOMEDICAL PHYSICS & ENGINEERING EXPRESS | 2025, Vol. 11, No. 2
Keywords
breast neoplasms; magnetic resonance imaging; deep learning; machine learning; radiomics; IMAGES
DOI
10.1088/2057-1976/adb494
Chinese Library Classification (CLC)
R8 [Special Medicine]; R445 [Diagnostic Imaging]
Discipline classification codes
1002; 100207; 1009
Abstract
Objectives: Accurate identification of molecular subtypes in breast cancer is critical for personalized treatment. This study introduces a novel neural network model, RAE-Net, based on Multimodal Feature Fusion (MFF) and the Evidential Deep Learning Algorithm (EDLA), to improve breast cancer subtype prediction using dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI).
Methods: A dataset of 344 patients with histologically confirmed breast cancer was divided into training (n = 200), validation (n = 60), and testing (n = 62) cohorts. RAE-Net, built on a ResNet-50 backbone with Multi-Head Attention (MHA) fusion and Multi-Layer Perceptron (MLP) mechanisms, combines radiomic and deep learning features for subtype prediction. The EDLA module adds uncertainty estimation to enhance classification reliability.
Results: The RAE-Net model incorporating the MFF module demonstrated superior performance, achieving a mean accuracy of 0.83 and a Macro-F1 score of 0.78, surpassing traditional radiomics models (accuracy: 0.79, Macro-F1: 0.75) and standalone deep learning models (accuracy: 0.80, Macro-F1: 0.76). When an EDLA uncertainty threshold of 0.2 was applied, performance improved markedly, with accuracy reaching 0.97 and Macro-F1 increasing to 0.92. RAE-Net also outperformed two recent deep learning networks, ResGANet and HIFUSE: it showed a 0.5% improvement in accuracy and a higher AUC than ResGANet, and compared with HIFUSE it reduced both the parameter count and the computational cost by 90% while increasing computation time by only 5.7%.
Conclusions: RAE-Net integrates feature fusion and uncertainty estimation to predict breast cancer subtypes from DCE-MRI. The model achieves high accuracy while maintaining computational efficiency, demonstrating its potential for clinical use as a reliable and resource-efficient diagnostic tool.
Pages: 12
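
The fusion-plus-uncertainty design described in the abstract (a ResNet-50 deep-feature branch fused with radiomic features via multi-head attention and an MLP, followed by an evidential head whose uncertainty score gates predictions at 0.2) can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the feature dimensions, layer sizes, and the four-class subtype setup are illustrative assumptions; only the overall MHA + MLP + EDLA structure and the 0.2 uncertainty threshold are taken from the abstract.

# Hedged sketch (not the authors' code): fuse radiomic and deep features with
# multi-head attention and an MLP, then produce Dirichlet evidence and a
# per-case uncertainty score in the evidential-deep-learning style.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FusionEDLHead(nn.Module):
    def __init__(self, deep_dim=2048, radiomic_dim=107, embed_dim=256,
                 num_heads=8, num_classes=4):
        super().__init__()
        # Project each modality into a shared embedding space.
        self.deep_proj = nn.Linear(deep_dim, embed_dim)
        self.rad_proj = nn.Linear(radiomic_dim, embed_dim)
        # Treat the two modalities as a length-2 token sequence for self-attention fusion.
        self.mha = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # MLP head producing non-negative per-class "evidence".
        self.mlp = nn.Sequential(
            nn.Linear(embed_dim, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )
        self.num_classes = num_classes

    def forward(self, deep_feat, rad_feat):
        tokens = torch.stack(
            [self.deep_proj(deep_feat), self.rad_proj(rad_feat)], dim=1
        )  # shape (B, 2, embed_dim)
        fused, _ = self.mha(tokens, tokens, tokens)    # attention-based fusion
        fused = fused.mean(dim=1)                      # pool the two modality tokens
        evidence = F.softplus(self.mlp(fused))         # evidence >= 0
        alpha = evidence + 1.0                         # Dirichlet parameters
        prob = alpha / alpha.sum(dim=1, keepdim=True)  # expected class probabilities
        uncertainty = self.num_classes / alpha.sum(dim=1)  # EDL vacuity in (0, 1]
        return prob, uncertainty


if __name__ == "__main__":
    model = FusionEDLHead()
    deep_feat = torch.randn(8, 2048)   # e.g. pooled ResNet-50 features (assumed)
    rad_feat = torch.randn(8, 107)     # e.g. a radiomics feature vector (assumed)
    prob, u = model(deep_feat, rad_feat)
    keep = u < 0.2                     # the abstract's uncertainty threshold
    print(prob.argmax(dim=1)[keep], u[keep])

Under this reading, the EDLA uncertainty acts as an abstention gate: cases whose Dirichlet-based uncertainty exceeds 0.2 are deferred rather than classified, which is consistent with the higher accuracy the abstract reports on the retained cases.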