Deeply Supervised Salient Object Detection with Short Connections

Cited by: 546
Authors
Hou, Qibin [1 ]
Cheng, Ming-Ming [1 ]
Hu, Xiaowei [1 ]
Borji, Ali [2 ]
Tu, Zhuowen [3 ]
Torr, Philip H. S. [4 ]
Affiliations
[1] Nankai Univ, CCCE, Tianjin 300071, Peoples R China
[2] Univ Cent Florida, Ctr Res Comp Vis, Orlando, FL 32816 USA
[3] Univ Calif San Diego, La Jolla, CA 92093 USA
[4] Univ Oxford, Oxford OX1 2JD, England
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK;
Keywords
Salient object detection; short connection; deeply supervised network; semantic segmentation; edge detection; IMAGE; ATTENTION; GRAPHICS; MODEL;
DOI
10.1109/TPAMI.2018.2815688
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recent progress on salient object detection has been substantial, benefiting mostly from the explosive development of Convolutional Neural Networks (CNNs). Semantic segmentation and salient object detection algorithms developed lately have mostly been based on Fully Convolutional Neural Networks (FCNs). There is still large room for improvement over generic FCN models, which do not explicitly deal with the scale-space problem. The Holistically-Nested Edge Detector (HED) provides a skip-layer structure with deep supervision for edge and boundary detection, but the performance gain of HED on saliency detection is not obvious. In this paper, we propose a new salient object detection method by introducing short connections to the skip-layer structures within the HED architecture. Our framework takes full advantage of multi-level and multi-scale features extracted from FCNs, providing more advanced representations at each layer, a property that is critically needed to perform segment detection. Our method produces state-of-the-art results on five widely tested salient object detection benchmarks, with advantages over existing algorithms in terms of efficiency (0.08 seconds per image), effectiveness, and simplicity. Beyond that, we conduct an exhaustive analysis of the role of training data on performance, and we provide a training set for future research and fair comparisons.
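The core idea of the abstract — routing deeper, more semantic side-output predictions into shallower side outputs before fusing them into a final saliency map — can be sketched in plain Python. This is an illustrative toy, not the authors' implementation: feature maps are flattened to equal-length lists of floats, and the fusion weights are hypothetical placeholders (the paper learns these jointly under deep supervision).

```python
# Toy sketch of "short connections" between side outputs (not the authors' code).
# Each side output is simplified to a flat list of activation values.

def short_connection_fuse(side_outputs, weights=None):
    """Enrich each shallow side output with all deeper side outputs
    (the "short connections"), then fuse the levels into one prediction."""
    n = len(side_outputs)
    enriched = [None] * n
    # Walk from the deepest level (most semantic) to the shallowest.
    for i in reversed(range(n)):
        acc = list(side_outputs[i])
        for j in range(i + 1, n):  # short connections from every deeper level
            acc = [a + b for a, b in zip(acc, enriched[j])]
        enriched[i] = acc
    # Final saliency map: weighted average of the enriched side outputs.
    if weights is None:
        weights = [1.0 / n] * n  # hypothetical uniform fusion weights
    fused = [0.0] * len(side_outputs[0])
    for w, level in zip(weights, enriched):
        fused = [f + w * v for f, v in zip(fused, level)]
    return enriched, fused
```

For example, with two levels `[[1.0, 0.0], [0.0, 1.0]]`, the shallow level absorbs the deep one before fusion, so fine localization (shallow) and semantic confidence (deep) both reach the final map.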
Pages: 815 - 828
Page count: 14
Related Papers
50 records total
  • [21] Salient object detection via spectral matting
    Naqvi, Syed S.
    Browne, Will N.
    Hollitt, Christopher
    PATTERN RECOGNITION, 2016, 51 : 209 - 224
  • [22] Decomposition and Completion Network for Salient Object Detection
    Wu, Zhe
    Su, Li
    Huang, Qingming
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2021, 30 : 6226 - 6239
  • [23] Salient Object Detection via Distribution of Contrast
    Huang, Xiaoming
    IMAGE AND GRAPHICS, ICIG 2019, PT I, 2019, 11901 : 553 - 565
  • [24] The Retrieval of the Beautiful: Self-Supervised Salient Object Detection for Beauty Product Retrieval
    Wang, Jiawei
    Zhu, Shuai
    Xu, Jiao
    Cao, Da
    PROCEEDINGS OF THE 27TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA (MM'19), 2019, : 2548 - 2552
  • [25] Joint salient object detection and existence prediction
    Jiang, Huaizu
    Cheng, Ming-Ming
    Li, Shi-Jie
    Borji, Ali
    Wang, Jingdong
    FRONTIERS OF COMPUTER SCIENCE, 2019, 13 (04) : 778 - 788
  • [26] What is a Salient Object? A Dataset and a Baseline Model for Salient Object Detection
    Borji, Ali
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2015, 24 (02) : 742 - 756
  • [27] Robust Background Exclusion for Salient Object Detection
    Hu, Yuming
    Zhou, Quan
    Gao, Guangwei
    Yao, Zhijun
    Ou, Weihua
    Latecki, Longin Jan
    2016 8TH INTERNATIONAL CONFERENCE ON WIRELESS COMMUNICATIONS & SIGNAL PROCESSING (WCSP), 2016,
  • [28] Background-Driven Salient Object Detection
    Wang, Zilei
    Xiang, Dao
    Hou, Saihui
    Wu, Feng
    IEEE TRANSACTIONS ON MULTIMEDIA, 2017, 19 (04) : 750 - 762
  • [29] Salient object detection: A survey
    Borji, Ali
    Cheng, Ming-Ming
    Hou, Qibin
    Jiang, Huaizu
    Li, Jia
    COMPUTATIONAL VISUAL MEDIA, 2019, 5 (02) : 117 - 150
  • [30] Salient Object Detection via Integrity Learning
    Zhuge, Mingchen
    Fan, Deng-Ping
    Liu, Nian
    Zhang, Dingwen
    Xu, Dong
    Shao, Ling
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (03) : 3738 - 3752