BSCA-Net: Bit Slicing Context Attention network for polyp segmentation

Cited by: 30
Authors
Lin, Yi [1]
Wu, Jichun [1]
Xiao, Guobao [1]
Guo, Junwen [1]
Chen, Geng [2]
Ma, Jiayi [3]
Affiliations
[1] Minjiang Univ, Coll Comp & Control Engn, Fujian Prov Key Lab Informat Proc & Intelligent Co, Fuzhou 350108, Peoples R China
[2] Northwestern Polytech Univ, Sch Comp Sci & Engn, Natl Engn Lab Integrated Aerosp Ground Ocean Big D, Xian 710072, Peoples R China
[3] Wuhan Univ, Elect Informat Sch, Wuhan 430072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Medical image segmentation; Polyp segmentation; Colonoscopy; Attention mechanism;
DOI
10.1016/j.patcog.2022.108917
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we propose a novel Bit-Slicing Context Attention Network (BSCA-Net), an end-to-end network, to improve the extraction of boundary information for polyp segmentation. The core of BSCA-Net is a new Bit Slice Context Attention (BSCA) module, which exploits bit-plane slicing information to effectively extract the boundary between polyps and the surrounding tissue. In addition, we design a novel Split-Squeeze-Bottleneck-Union (SSBU) module to exploit geometrical information from different aspects. Building on SSBU, we further propose a multipath concat attention decoder (MCAD) and a multipath attention concat encoder (MACE) to improve the network's performance for polyp segmentation. Finally, by combining BSCA, SSBU, MCAD, and MACE, the proposed BSCA-Net effectively suppresses noise in feature maps while simultaneously improving feature expression at different levels. Empirical experiments on five benchmark datasets (Kvasir, CVC-ClinicDB, ETIS, CVC-ColonDB, and CVC-300) demonstrate the superiority of the proposed BSCA-Net over existing cutting-edge methods. (c) 2022 Elsevier Ltd. All rights reserved.
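As background for the bit-plane slicing that the BSCA module builds on, the decomposition itself can be sketched in a few lines of NumPy. This is an illustrative sketch of standard bit-plane slicing, not the authors' exact module; the function name `bit_plane_slices` and the toy image are assumptions for demonstration.

```python
import numpy as np

def bit_plane_slices(img: np.ndarray) -> np.ndarray:
    """Decompose an 8-bit grayscale image into its 8 binary bit planes.

    Returns an array of shape (8, H, W); plane k holds bit k of each pixel.
    Plane 7 (most significant) carries coarse structure, while low-order
    planes capture fine detail such as edges and noise.
    """
    assert img.dtype == np.uint8
    planes = [(img >> k) & 1 for k in range(8)]
    return np.stack(planes, axis=0)

# Toy 2x2 "image": 200 = 0b11001000, 10 = 0b00001010
img = np.array([[200, 10], [10, 200]], dtype=np.uint8)
planes = bit_plane_slices(img)

# Summing plane_k * 2^k losslessly reconstructs the original image.
recon = sum(planes[k].astype(np.uint8) << k for k in range(8))
assert np.array_equal(recon, img)
```

Because high-order planes retain the gross shape of a region while low-order planes concentrate fine transitions, feeding such slices to an attention module is one plausible way to emphasize boundaries between a polyp and surrounding tissue.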
Pages: 11
References
46 items
[1]  
Akbari M, 2018, IEEE ENG MED BIO, P69, DOI 10.1109/EMBC.2018.8512197
[2]   WM-DOVA maps for accurate polyp highlighting in colonoscopy: Validation vs. saliency maps from physicians [J].
Bernal, Jorge ;
Javier Sanchez, F. ;
Fernandez-Esparrach, Gloria ;
Gil, Debora ;
Rodriguez, Cristina ;
Vilarino, Fernando .
COMPUTERIZED MEDICAL IMAGING AND GRAPHICS, 2015, 43 :99-111
[3]   Fully Convolutional Neural Networks for Polyp Segmentation in Colonoscopy [J].
Brandao, Patrick ;
Mazomenos, Evangelos ;
Ciuti, Gastone ;
Calio, Renato ;
Bianchi, Federico ;
Menciassi, Arianna ;
Dario, Paolo ;
Koulaouzidis, Anastasios ;
Arezzo, Alberto ;
Stoyanov, Danail .
MEDICAL IMAGING 2017: COMPUTER-AIDED DIAGNOSIS, 2017, 10134
[4]  
Chen J., 2021, arXiv
[5]   DeepLab: Semantic Image Segmentation with Deep Convolutional Nets, Atrous Convolution, and Fully Connected CRFs [J].
Chen, Liang-Chieh ;
Papandreou, George ;
Kokkinos, Iasonas ;
Murphy, Kevin ;
Yuille, Alan L. .
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2018, 40 (04) :834-848
[6]   Reverse Attention for Salient Object Detection [J].
Chen, Shuhan ;
Tan, Xiuli ;
Wang, Ben ;
Hu, Xuelong .
COMPUTER VISION - ECCV 2018, PT IX, 2018, 11213 :236-252
[7]  
Deng-Ping Fan, 2020, Medical Image Computing and Computer Assisted Intervention - MICCAI 2020. 23rd International Conference. Proceedings. Lecture Notes in Computer Science (LNCS 12266), P263, DOI 10.1007/978-3-030-59725-2_26
[8]   Selective Feature Aggregation Network with Area-Boundary Constraints for Polyp Segmentation [J].
Fang, Yuqi ;
Chen, Cheng ;
Yuan, Yixuan ;
Tong, Kai-yu .
MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION - MICCAI 2019, PT I, 2019, 11764 :302-310
[9]   Res2Net: A New Multi-Scale Backbone Architecture [J].
Gao, Shang-Hua ;
Cheng, Ming-Ming ;
Zhao, Kai ;
Zhang, Xin-Yu ;
Yang, Ming-Hsuan ;
Torr, Philip .
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2021, 43 (02) :652-662
[10]   Learn to Threshold: ThresholdNet With Confidence-Guided Manifold Mixup for Polyp Segmentation [J].
Guo, Xiaoqing ;
Yang, Chen ;
Liu, Yajie ;
Yuan, Yixuan .
IEEE TRANSACTIONS ON MEDICAL IMAGING, 2021, 40 (04) :1134-1146