Explainable attention based breast tumor segmentation using a combination of UNet, ResNet, DenseNet, and EfficientNet models

Times Cited: 13
Authors
Anari, Shokofeh [1]
Sadeghi, Soroush [2]
Sheikhi, Ghazaal [3]
Ranjbarzadeh, Ramin [4]
Bendechache, Malika [5]
Affiliations
[1] Islamic Azad Univ, Dept Accounting Econ & Financial Sci, South Tehran Branch, Tehran, Iran
[2] Univ Tehran, Sch Elect & Comp Engn, Tehran, Iran
[3] Final Int Univ, Mersin 10, Kyrenia, North Cyprus, Cyprus
[4] Dublin City Univ, Fac Engn & Comp, Sch Comp, Dublin, Ireland
[5] Univ Galway, ADAPT Res Ctr, Sch Comp Sci, Galway, Ireland
Keywords
Breast tumor segmentation; UNet; Grad-CAM; Non-local attention; Attention mechanisms
DOI
10.1038/s41598-024-84504-y
CLC Numbers (Chinese Library Classification)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
This study utilizes the Breast Ultrasound Image (BUSI) dataset to present a deep learning technique for breast tumor segmentation based on a modified UNet architecture. To improve segmentation accuracy, the model integrates attention mechanisms, such as the Convolutional Block Attention Module (CBAM) and Non-Local Attention, with advanced encoder architectures, including ResNet, DenseNet, and EfficientNet. These attention mechanisms enable the model to focus more effectively on relevant tumor areas, resulting in significant performance improvements. Models incorporating attention mechanisms outperformed those without, as reflected in superior evaluation metrics. The effects of Dice Loss and Binary Cross-Entropy (BCE) Loss on the model's performance were also analyzed. Dice Loss maximized the overlap between predicted and actual segmentation masks, leading to more precise boundary delineation, while BCE Loss achieved higher recall, improving the detection of tumor areas. Grad-CAM visualizations further demonstrated that attention-based models enhanced interpretability by accurately highlighting tumor areas. The findings indicate that combining advanced encoder architectures, attention mechanisms, and the UNet framework can yield more reliable and accurate breast tumor segmentation. Future research will explore the use of multi-modal imaging, real-time deployment for clinical applications, and more advanced attention mechanisms to further improve segmentation performance.
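To make the loss comparison in the abstract concrete, the following is a minimal sketch, assuming a PyTorch implementation and binary tumor masks, of a combined Dice + BCE objective of the kind the study analyzes; the class name DiceBCELoss, the 0.5 weighting, and the smoothing constant are illustrative assumptions, not the authors' reported configuration.

```python
import torch
import torch.nn as nn

class DiceBCELoss(nn.Module):
    """Illustrative Dice + BCE loss for binary segmentation masks
    (hypothetical weighting, not the paper's exact setup)."""
    def __init__(self, bce_weight=0.5, smooth=1.0):
        super().__init__()
        self.bce = nn.BCEWithLogitsLoss()
        self.bce_weight = bce_weight
        self.smooth = smooth

    def forward(self, logits, targets):
        # BCE rewards per-pixel correctness (tends to favor recall on tumor pixels)
        bce_loss = self.bce(logits, targets)

        # Dice rewards overlap between predicted and ground-truth masks
        probs = torch.sigmoid(logits).view(logits.size(0), -1)
        targets_flat = targets.view(targets.size(0), -1)
        intersection = (probs * targets_flat).sum(dim=1)
        dice = (2.0 * intersection + self.smooth) / (
            probs.sum(dim=1) + targets_flat.sum(dim=1) + self.smooth
        )
        dice_loss = 1.0 - dice.mean()

        return self.bce_weight * bce_loss + (1.0 - self.bce_weight) * dice_loss
```

Similarly, a minimal CBAM-style block (channel attention followed by spatial attention, as the abstract describes for the modified UNet) might be sketched as follows; the reduction ratio and the 7x7 spatial kernel follow the original CBAM design and are assumptions with respect to this article, as is where the block is inserted in the encoder-decoder.

```python
import torch
import torch.nn as nn

class CBAM(nn.Module):
    """Minimal Convolutional Block Attention Module: channel attention
    followed by spatial attention, applied to a feature map (B, C, H, W)."""
    def __init__(self, channels, reduction=16, spatial_kernel=7):
        super().__init__()
        # Channel attention: shared MLP over average- and max-pooled descriptors
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # Spatial attention: conv over channel-wise average and max maps
        self.spatial = nn.Conv2d(2, 1, spatial_kernel, padding=spatial_kernel // 2)

    def forward(self, x):
        b, c, _, _ = x.shape
        # Channel attention weights
        avg = self.mlp(x.mean(dim=(2, 3)))
        mx = self.mlp(x.amax(dim=(2, 3)))
        x = x * torch.sigmoid(avg + mx).view(b, c, 1, 1)

        # Spatial attention weights
        avg_map = x.mean(dim=1, keepdim=True)
        max_map = x.amax(dim=1, keepdim=True)
        attn = torch.sigmoid(self.spatial(torch.cat([avg_map, max_map], dim=1)))
        return x * attn
```

In a UNet-style network, such a block would typically refine encoder feature maps before they pass through skip connections; the exact placement used in the paper is not specified here.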
Pages: 39