Explainable breast cancer molecular expression prediction using multi-task deep-learning based on 3D whole breast ultrasound

Cited by: 0
Authors
Huang, Zengan [1 ]
Zhang, Xin [1 ]
Ju, Yan [2 ]
Zhang, Ge [2 ]
Chang, Wanying [2 ]
Song, Hongping [2 ]
Gao, Yi [1 ]
Affiliations
[1] Shenzhen Univ, Med Sch, Sch Biomed Engn, Shenzhen 518055, Guangdong, Peoples R China
[2] Fourth Mil Med Univ, Xijing Hosp, Dept Ultrasound, 127 Changle West Rd, Xian 710032, Peoples R China
Source
INSIGHTS INTO IMAGING | 2024, Vol. 15, No. 1
Funding
National Natural Science Foundation of China
Keywords
Breast cancer; Deep learning; Ultrasound imaging; MAMMOGRAPHY; NETWORK;
DOI
10.1186/s13244-024-01810-9
Chinese Library Classification (CLC)
R8 [Special Medicine]; R445 [Diagnostic Imaging]
Discipline Codes
1002; 100207; 1009
Abstract
Objectives: To noninvasively estimate the expression of three breast cancer biomarkers, estrogen receptor (ER), progesterone receptor (PR), and human epidermal growth factor receptor 2 (HER2), and to enhance performance and interpretability via multi-task deep learning.
Methods: The study included 388 breast cancer patients who underwent 3D whole breast ultrasound system (3DWBUS) examinations at Xijing Hospital between October 2020 and September 2021. Two predictive models were developed: a single-task model that predicts biomarker expression, and a multi-task model that combines tumor segmentation with biomarker prediction to enhance interpretability. Performance evaluation included individual and overall prediction metrics, and DeLong's test was used for performance comparison. The models' attention regions were visualized using Grad-CAM++.
Results: All patients were randomly split into a training set (n = 240, 62%), a validation set (n = 60, 15%), and a test set (n = 88, 23%). In the individual evaluation of ER, PR, and HER2 expression prediction on the test set, the single-task and multi-task models achieved respective AUCs of 0.809 and 0.735 for ER, 0.688 and 0.767 for PR, and 0.626 and 0.697 for HER2. In the overall evaluation, the multi-task model demonstrated superior performance in the test set, achieving a higher macro AUC of 0.733, in contrast to 0.708 for the single-task model. Grad-CAM++ visualization revealed that the multi-task model focused more strongly on diseased tissue areas, improving the interpretability of how the model works.
Conclusion: Both models demonstrated strong performance, with the multi-task model excelling in accuracy and offering improved interpretability on noninvasive 3DWBUS images via Grad-CAM++ visualization.
Critical relevance statement: The multi-task deep learning model effectively predicts breast cancer biomarkers, offering direct biomarker identification and improved clinical interpretability, potentially boosting the efficiency of targeted drug screening.
Key Points:
  • Tumoral biomarkers are paramount for determining breast cancer treatment.
  • The multi-task model can improve prediction performance and interpretability in clinical practice.
  • 3DWBUS-based deep learning models excelled in predicting breast cancer biomarkers.
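The overall comparison above rests on the macro AUC, i.e., the unweighted mean of the per-biomarker AUCs. As a point of clarification only (not code from the paper; function names are illustrative), the metric can be sketched with the rank-based Mann-Whitney formulation of the AUC:

```python
def auc(labels, scores):
    # Rank-based (Mann-Whitney U) formulation of the ROC AUC:
    # the probability that a random positive case outranks a random negative,
    # with ties counted as half a win.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def macro_auc(tasks):
    # Unweighted mean of per-task AUCs, e.g. over the ER, PR, and HER2 tasks.
    return sum(auc(y, s) for y, s in tasks) / len(tasks)
```

Because each biomarker contributes equally regardless of class balance, a multi-task model can post a higher macro AUC (0.733 vs 0.708 here) even while losing on one individual task, as the single-task model did for ER.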
Pages: 13