Fast Camouflaged Object Detection via Edge-based Reversible Re-calibration Network

Cited by: 161
Authors
Ji, Ge-Peng [1 ]
Zhu, Lei [2 ]
Zhuge, Mingchen [3 ]
Fu, Keren [4 ]
Affiliations
[1] Wuhan Univ, Sch Comp Sci, Wuhan, Peoples R China
[2] Hong Kong Univ Sci & Technol Guangzhou, Guangzhou, Peoples R China
[3] China Univ Geosci, Sch Comp Sci, Wuhan, Peoples R China
[4] Sichuan Univ, Coll Comp Sci, Chengdu, Peoples R China
Keywords
Camouflaged Object Detection; Reversible Re-calibration Unit; Selective Edge Aggregation; NGES Priors; Diagnosis
DOI
10.1016/j.patcog.2021.108414
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Camouflaged Object Detection (COD) aims to detect objects whose patterns (e.g., texture, intensity, colour) closely resemble their surroundings, and it has recently attracted growing research interest. Because camouflaged objects often present very ambiguous boundaries, determining object locations as well as their weak boundaries is both challenging and key to this task. Inspired by the biological visual perception process by which a human observer discovers camouflaged objects, this paper proposes a novel edge-based reversible re-calibration network called ERRNet. Our model is characterized by two innovative designs, namely Selective Edge Aggregation (SEA) and the Reversible Re-calibration Unit (RRU), which aim to model this visual perception behaviour and achieve an effective edge prior and cross-comparison between potential camouflaged regions and the background. More importantly, RRU incorporates diverse priors with more comprehensive information compared to existing COD models. Experimental results show that ERRNet outperforms existing cutting-edge baselines on three COD datasets and five medical image segmentation datasets. In particular, compared with the existing top-1 model SINet, ERRNet significantly improves performance by ~6% (mean E-measure) at notably high speed (79.3 FPS), showing that ERRNet could be a general and robust solution for the COD task. (c) 2021 Elsevier Ltd. All rights reserved.
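The "cross-comparison" idea the abstract attributes to RRU is closely related to reverse attention, where a coarse prediction is inverted so the next stage re-examines ambiguous background and boundary regions. The following is a minimal, hypothetical sketch of that generic mechanism only, not the authors' implementation; the function name `reverse_recalibrate` and the flat per-pixel list representation are invented here for illustration.

```python
import math

def sigmoid(x):
    """Standard logistic function, mapping a logit to a (0, 1) probability."""
    return 1.0 / (1.0 + math.exp(-x))

def reverse_recalibrate(features, coarse_logits):
    """Re-weight per-pixel features by the *inverse* of a coarse prediction.

    Pixels the coarse stage confidently marks as foreground are suppressed,
    so subsequent refinement concentrates on uncertain background and
    boundary regions (the reverse-attention principle).
    """
    out = []
    for f, z in zip(features, coarse_logits):
        w = 1.0 - sigmoid(z)  # large weight where coarse prediction is weak
        out.append(f * w)
    return out
```

In a real network this re-weighting would operate on feature tensors and be combined with edge priors at multiple decoder stages; the scalar version above only illustrates the inversion step.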
Pages: 12