BUViTNet: Breast Ultrasound Detection via Vision Transformers

Cited by: 37
Authors
Ayana, Gelan [1]
Choe, Se-Woon [1,2]
Affiliations
[1] Kumoh Natl Inst Technol, Dept Med IT Convergence Engn, Gumi 39253, South Korea
[2] Kumoh Natl Inst Technol, Dept IT Convergence Engn, Gumi 39253, South Korea
Funding
National Research Foundation of Singapore
Keywords
breast cancer; ultrasound; vision transformer; convolutional neural network; transfer learning
DOI
10.3390/diagnostics12112654
CLC Number
R5 [Internal Medicine]
Subject Classification Codes
1002; 100201
Abstract
Convolutional neural networks (CNNs) have enhanced early breast cancer detection from ultrasound images. Vision transformers (ViTs) have recently surpassed CNNs as the most effective method for natural image analysis. ViTs incorporate more global information than CNNs at lower layers, and their skip connections are more powerful than those of CNNs, which gives ViTs superior performance. However, the effectiveness of ViTs in breast ultrasound imaging has not yet been investigated. Here, we present BUViTNet (breast ultrasound detection via ViTs), in which ViT-based multistage transfer learning is performed using ImageNet and cancer cell image datasets prior to transfer learning for classifying breast ultrasound images. We utilized two publicly available breast ultrasound image datasets, Mendeley and Breast Ultrasound Images (BUSI), to train and evaluate our algorithm. The proposed method achieved the highest area under the receiver operating characteristic curve (AUC) of 1 ± 0, Matthews correlation coefficient (MCC) of 1 ± 0, and kappa score of 1 ± 0 on the Mendeley dataset. Furthermore, BUViTNet achieved the highest AUC of 0.968 ± 0.02, MCC of 0.961 ± 0.01, and kappa score of 0.959 ± 0.02 on the BUSI dataset. BUViTNet outperformed a ViT trained from scratch, ViT-based conventional transfer learning, and CNN-based transfer learning in classifying breast ultrasound images (p < 0.01 in all cases). Our findings indicate that improved transformers are effective in analyzing breast images and can provide an improved diagnosis if used in clinical settings. Future work will consider the use of a wide range of datasets and parameters for optimized performance.
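The abstract reports MCC and kappa alongside AUC; both can be computed directly from a binary confusion matrix. Below is a minimal pure-Python sketch of these two metrics (the function names and the toy labels are illustrative, not taken from the paper); a perfect classifier, as reported on the Mendeley dataset, yields MCC = kappa = 1.

```python
import math

def confusion_counts(y_true, y_pred):
    """Tally a binary confusion matrix (positive class = 1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

def kappa(tp, tn, fp, fn):
    """Cohen's kappa: agreement corrected for chance."""
    n = tp + tn + fp + fn
    po = (tp + tn) / n  # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)  # chance agreement
    return (po - pe) / (1 - pe) if pe != 1 else 1.0

# Toy example: perfect predictions give MCC = kappa = 1.
y_true = [1, 1, 1, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0]
tp, tn, fp, fn = confusion_counts(y_true, y_pred)
print(mcc(tp, tn, fp, fn), kappa(tp, tn, fp, fn))  # 1.0 1.0
```

Unlike accuracy, both metrics penalize chance-level agreement on imbalanced data, which is why they are often preferred for medical image classification benchmarks.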
Pages: 14