Deeply Supervised Salient Object Detection with Short Connections

Cited by: 708
Authors
Hou, Qibin [1 ]
Cheng, Ming-Ming [1 ]
Hu, Xiaowei [1 ]
Borji, Ali [2 ]
Tu, Zhuowen [3 ]
Torr, Philip [4 ]
Affiliations
[1] Nankai Univ, CCCE, Tianjin, Peoples R China
[2] UCF, CRCV, Orlando, FL USA
[3] Univ Calif San Diego, San Diego, CA USA
[4] Univ Oxford, Oxford, England
Source
30TH IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2017) | 2017
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK;
Keywords
MODEL;
DOI
10.1109/CVPR.2017.563
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recent progress on saliency detection is substantial, benefiting mostly from the explosive development of Convolutional Neural Networks (CNNs). Semantic segmentation and saliency detection algorithms developed lately have been mostly based on Fully Convolutional Neural Networks (FCNs). There is still large room for improvement over the generic FCN models that do not explicitly deal with the scale-space problem. The Holistically-Nested Edge Detector (HED) provides a skip-layer structure with deep supervision for edge and boundary detection, but the performance gain of HED on saliency detection is not obvious. In this paper, we propose a new saliency method by introducing short connections to the skip-layer structures within the HED architecture. Our framework provides rich multi-scale feature maps at each layer, a property that is critically needed to perform segment detection. Our method produces state-of-the-art results on 5 widely tested salient object detection benchmarks, with advantages in terms of efficiency (0.08 seconds per image), effectiveness, and simplicity over the existing algorithms.
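To make the "short connection" idea in the abstract concrete, below is a minimal PyTorch sketch, not the authors' implementation: side outputs are taken from a VGG-16 backbone (as in HED), upsampled to the input resolution, and each shallow side prediction is fused with all deeper ones through a 1x1 convolution before the final saliency map is formed by averaging. The stage split points, channel sizes, fusion scheme, and the use of `torchvision.models.vgg16` are illustrative assumptions.

```python
# Hedged sketch of HED-style side outputs with short connections from deeper
# to shallower levels. Architectural details are assumptions, not the paper's
# exact configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision


class ShortConnectionSaliency(nn.Module):
    def __init__(self):
        super().__init__()
        # VGG-16 backbone (torchvision >= 0.13 uses the `weights` keyword).
        vgg = torchvision.models.vgg16(weights=None).features
        # Assumed split of VGG-16 into five convolutional stages.
        self.stages = nn.ModuleList(
            [vgg[:4], vgg[4:9], vgg[9:16], vgg[16:23], vgg[23:30]])
        # One 1x1 score layer per side output (64/128/256/512/512 channels).
        self.scores = nn.ModuleList(
            [nn.Conv2d(c, 1, 1) for c in (64, 128, 256, 512, 512)])
        # Short-connection fusion: a side score plus all deeper scores -> 1 map.
        self.fuse = nn.ModuleList([nn.Conv2d(5 - i, 1, 1) for i in range(5)])

    def forward(self, x):
        h, w = x.shape[2:]
        feats, side = x, []
        for stage, score in zip(self.stages, self.scores):
            feats = stage(feats)
            # Upsample every side score to the input resolution.
            side.append(F.interpolate(score(feats), size=(h, w),
                                      mode='bilinear', align_corners=False))
        outs = []
        for i in range(5):
            # Short connection: concatenate this side output with all deeper ones,
            # so shallow predictions also see high-level semantic evidence.
            fused = torch.cat(side[i:], dim=1)
            outs.append(self.fuse[i](fused))
        # Final saliency map: average of the connected side predictions.
        return torch.sigmoid(torch.stack(outs, 0).mean(0))


if __name__ == "__main__":
    model = ShortConnectionSaliency()
    saliency = model(torch.randn(1, 3, 224, 224))  # (1, 1, 224, 224) map in [0, 1]
    print(saliency.shape)
```

In the paper each side output is additionally trained with its own deeply supervised loss; the sketch omits the loss terms and shows only the forward fusion pattern.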
Pages: 5300-5309
Page count: 10