Anabranch network for camouflaged object segmentation

Cited by: 352
Authors
Trung-Nghia Le [1,5]
Nguyen, Tam V. [2]
Nie, Zhongliang [2]
Minh-Triet Tran [3]
Sugimoto, Akihiro [4]
Affiliations
[1] Grad Univ Adv Studies SOKENDAI, Dept Informat, Tokyo, Japan
[2] Univ Dayton, Dept Comp Sci, Dayton, OH 45469 USA
[3] Univ Sci, VNU HCM, Ho Chi Minh, Vietnam
[4] Natl Inst Informat, Tokyo, Japan
[5] Univ Tokyo, Inst Ind Sci, Tokyo, Japan
Keywords
Camouflaged object segmentation; Anabranch network
DOI
10.1016/j.cviu.2019.04.006
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Camouflaged objects attempt to conceal their texture by blending into the background, and discriminating them from the background is difficult even for human beings. The main objective of this paper is to explore the camouflaged object segmentation problem, namely, segmenting the camouflaged object(s) in a given image. Despite a wide range of potential applications, including the preservation of wild animals, the discovery of new species, surveillance systems, and search-and-rescue missions in the event of natural disasters such as earthquakes, floods, or hurricanes, this problem has not been well studied. To address this new and challenging problem, we provide a new image dataset of camouflaged objects for benchmarking purposes. In addition, we propose a general end-to-end network, called the Anabranch Network, that leverages both classification and segmentation tasks. Unlike existing segmentation networks, our proposed network possesses a second branch for classification, which predicts the probability that an image contains camouflaged object(s); this prediction is then fused into the main segmentation branch to boost segmentation accuracy. Extensive experiments conducted on the newly built dataset demonstrate the effectiveness of our network with various fully convolutional networks.
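The fusion idea in the abstract, weighting the segmentation output by the classification branch's image-level prediction, can be sketched as follows. The elementwise-product fusion, array shapes, and function names here are illustrative assumptions, not implementation details taken from this record.

```python
import numpy as np

def fuse_branches(seg_map: np.ndarray, cls_prob: float) -> np.ndarray:
    """Scale each pixel's foreground probability by the image-level
    probability of containing a camouflaged object (elementwise product,
    one plausible fusion scheme; not necessarily the authors' exact one)."""
    return seg_map * cls_prob

# Toy 2x2 map of per-pixel camouflaged-object probabilities
# from a hypothetical segmentation branch.
seg_map = np.array([[0.9, 0.2],
                    [0.1, 0.8]])
cls_prob = 0.5  # classification branch: 50% chance the image has a camouflaged object
out = fuse_branches(seg_map, cls_prob)
# out == [[0.45, 0.10], [0.05, 0.40]]
```

When the classification branch is confident no camouflaged object is present (probability near 0), the fused map is suppressed toward zero, which is the intended benefit of adding the second branch.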
Pages: 45-56 (12 pages)