ETECADx: Ensemble Self-Attention Transformer Encoder for Breast Cancer Diagnosis Using Full-Field Digital X-ray Breast Images

Cited: 30
Authors
Al-Hejri, Aymen M. [1 ,2 ]
Al-Tam, Riyadh M. [1 ,2 ]
Fazea, Muneer [3 ,4 ]
Sable, Archana Harsing [1 ]
Lee, Soojeong [5 ]
Al-antari, Mugahed A. [6 ]
Affiliations
[1] Swami Ramanand Teerth Marathwada Univ, Sch Computat Sci, Nanded 431606, Maharashtra, India
[2] Univ Albaydha, Fac Adm & Comp Sci, Albaydha, Yemen
[3] Al Maamon Diagnost Ctr, Dept Radiol, Sanaa, Yemen
[4] Ibb Univ Med Sci, Sch Med, Dept Radiol, Ibb, Yemen
[5] Sejong Univ, Dept Comp Engn, Coll Software & Convergence Technol, Daeyang AI Ctr, Seoul 05006, South Korea
[6] Sejong Univ, Dept Artificial Intelligence, Coll Software & Convergence Technol, Daeyang AI Ctr, Seoul 05006, South Korea
Funding
National Research Foundation of Singapore;
Keywords
breast cancer; hybrid CAD system; ensemble transfer learning; convolution neural network (CNN); transformer encoder; expert physician validation and verification; CLASSIFICATION; MAMMOGRAMS; SYSTEM;
DOI
10.3390/diagnostics13010089
Chinese Library Classification
R5 [Internal Medicine];
Subject Classification Code
1002; 100201;
Abstract
Early detection of breast cancer is an essential procedure to reduce the mortality rate among women. In this paper, a new AI-based computer-aided diagnosis (CAD) framework called ETECADx is proposed by fusing the benefits of both ensemble transfer learning of convolutional neural networks and the self-attention mechanism of the vision transformer (ViT) encoder. Accurate and precise high-level deep features are generated via the backbone ensemble network, while the transformer encoder is used to predict the breast cancer probabilities in two approaches: Approach A (i.e., binary classification) and Approach B (i.e., multi-class classification). To build the proposed CAD system, the benchmark public multi-class INbreast dataset is used. Meanwhile, private real breast cancer images are collected and annotated by expert radiologists to validate the prediction performance of the proposed ETECADx framework. Promising evaluation results are achieved on the INbreast mammograms, with overall accuracies of 98.58% and 97.87% for the binary and multi-class approaches, respectively. Compared with the individual backbone networks, the proposed ensemble learning model improves the breast cancer prediction performance by 6.6% for the binary and 4.6% for the multi-class approach. The proposed hybrid ETECADx shows further prediction improvement when the ViT-based ensemble backbone network is used, by 8.1% and 6.2% for binary and multi-class diagnosis, respectively. For validation purposes using the real breast images, the proposed CAD system provides encouraging prediction accuracies of 97.16% for the binary and 89.40% for the multi-class approach. ETECADx can predict the breast lesions in a single mammogram in 0.048 s on average. Such promising performance could be useful in practical CAD applications, providing a second supporting opinion for distinguishing various breast cancer malignancies.
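The abstract's hybrid design pairs CNN-extracted deep features with a ViT-style self-attention encoder. As background for readers unfamiliar with that encoder, the sketch below illustrates the core operation, single-head scaled dot-product attention, softmax(QKᵀ/√d)·V (Vaswani et al., 2017, reference [65]), in plain Python. It is a minimal illustration of the mechanism, not the authors' implementation; all function names and the toy inputs are hypothetical.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Single-head self-attention: softmax(Q K^T / sqrt(d)) V.

    Q, K, V are lists of d-dimensional vectors, one per token. In a ViT
    encoder the tokens would be (projected) image-patch embeddings; in a
    hybrid CAD pipeline they could come from a CNN backbone's feature maps.
    """
    d = len(Q[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)  # attention weights sum to 1
        # Output = attention-weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V)) for j in range(d)])
    return out

# Toy self-attention: 3 tokens of dimension 2 attend to each other (Q = K = V).
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
attended = scaled_dot_product_attention(tokens, tokens, tokens)
```

Because the attention weights for each query sum to one, every output vector is a convex combination of the value vectors, which is what lets the encoder mix information across all tokens (image patches) at once.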
Pages: 30