RRANet: A Reverse Region-Aware Network with Edge Difference for Accurate Breast Tumor Segmentation in Ultrasound Images

Cited by: 0
Authors
Chen, Zhengyu [1 ]
Song, Xiaoning [1 ,2 ]
Hua, Yang [1 ]
Zhan, Wenjie [1 ]
Affiliations
[1] Jiangnan Univ, Sch Artificial Intelligence & Comp Sci, Wuxi, Jiangsu, Peoples R China
[2] DiTu Suzhou Biotechnol Co Ltd, Suzhou, Peoples R China
Source
PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2024, PT XIV | 2025 / Vol. 15044
Funding
National Natural Science Foundation of China;
Keywords
Breast tumor; Edge prior information; Region-aware model; Segmentation; FEATURES;
DOI
10.1007/978-981-97-8496-7_35
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Breast UltraSound (BUS) image segmentation is crucial for the diagnosis and analysis of breast cancer. However, most existing methods for BUS tend to overlook vital edge information. Meanwhile, noise, similar intensity distributions, and varying tumor shapes and sizes lead to severe missed and false detections. To address these issues, we propose a Reverse Region-Aware Network with Edge Difference, called RRANet, which learns edge information and region information from low-level and high-level features, respectively. Specifically, we first design an Edge Difference Convolution (EDC) to fully mine edge information. EDC aggregates intensity and gradient information to obtain edge details of low-level features in both the horizontal and vertical directions. Next, we propose a Multi-Scale Adaptive Module (MSAM) that effectively extracts global information from high-level features. MSAM encodes features along the spatial dimension, which expands the receptive field and captures more local contextual information. In addition, we develop the Reverse Region-Aware Module (RRAM) to gradually refine the global information. This module establishes the relationship between region and edge cues while correcting erroneous predictions. Finally, the edge information and global information are fused to improve the prediction accuracy on BUS images. Extensive experiments on three challenging public BUS datasets show that our model outperforms several state-of-the-art medical image segmentation methods.
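The abstract describes EDC and RRAM only at a high level. The minimal PyTorch sketch below is one possible reading of those two ideas: a convolution that aggregates an intensity response with horizontal and vertical finite-difference (gradient) responses, and a reverse-attention-style refinement that re-weights features by the complement of the current prediction. Everything here, including the class names EdgeDifferenceConv and ReverseRegionAware and the specific layer choices, is an illustrative assumption rather than the authors' implementation.

```python
# Illustrative sketch only; the paper's exact formulation is not given in this record.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EdgeDifferenceConv(nn.Module):
    """Assumed reading of EDC: fuse an intensity response with
    horizontal and vertical gradient (finite-difference) responses."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.intensity = nn.Conv2d(in_ch, out_ch, 3, padding=1)          # intensity cue
        self.grad_h = nn.Conv2d(in_ch, out_ch, (1, 3), padding=(0, 1))   # horizontal gradient cue
        self.grad_v = nn.Conv2d(in_ch, out_ch, (3, 1), padding=(1, 0))   # vertical gradient cue
        self.fuse = nn.Sequential(nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # First-order differences along the width (dh) and height (dv) axes.
        dh = x - F.pad(x, (1, 0, 0, 0))[:, :, :, :-1]
        dv = x - F.pad(x, (0, 0, 1, 0))[:, :, :-1, :]
        return self.fuse(self.intensity(x) + self.grad_h(dh) + self.grad_v(dv))


class ReverseRegionAware(nn.Module):
    """Assumed reading of RRAM: weight high-level features by the
    complement of the current prediction so the network revisits
    regions it has not yet labelled, then add a residual correction."""

    def __init__(self, ch: int):
        super().__init__()
        self.residual = nn.Conv2d(ch, 1, 3, padding=1)

    def forward(self, feat: torch.Tensor, coarse_logits: torch.Tensor) -> torch.Tensor:
        # Upsample the coarse prediction to the feature resolution.
        pred = F.interpolate(coarse_logits, size=feat.shape[2:],
                             mode="bilinear", align_corners=False)
        reverse_attn = 1.0 - torch.sigmoid(pred)          # emphasise un-predicted regions
        return pred + self.residual(feat * reverse_attn)  # refined segmentation logits
```

As a quick sanity check under these assumptions, EdgeDifferenceConv(64, 64)(torch.randn(1, 64, 96, 96)) returns a 1x64x96x96 tensor, and ReverseRegionAware(64) maps a 48x48 feature map plus a coarser logit map to refined 48x48 logits.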
Pages: 504-517
Page count: 14