Perception-and-Regulation Network for Salient Object Detection
Cited by: 6
Authors:
Zhu, Jinchao [1,2]; Zhang, Xiaoyu [1]; Fang, Xian [3]; Wang, Yuxuan [1]; Tan, Panlong [1]; Liu, Junnan [4]
Affiliations:
[1] Nankai Univ, Coll Artificial Intelligence, Tianjin 300350, Peoples R China
[2] Tsinghua Univ, Dept Automat, BNRist, Tianjin 300350, Peoples R China
[3] Nankai Univ, Coll Comp Sci, Tianjin 300350, Peoples R China
[4] Harbin Engn Univ, Coll Intelligent Syst Sci & Engn, Harbin 150001, Peoples R China
Funding:
National Natural Science Foundation of China;
Keywords:
Semantics;
Regulation;
Object detection;
Feature extraction;
Convolution;
Logic gates;
Task analysis;
Salient object detection;
convolutional neural networks;
attention mechanism;
global perception;
MODEL;
DOI:
10.1109/TMM.2022.3210366
Chinese Library Classification:
TP [Automation Technology, Computer Technology];
Discipline Classification Code:
0812;
Abstract:
Effective fusion of different types of features is the key to salient object detection (SOD). Most existing network structures are designed from the subjective experience of scholars, and the feature fusion process does not consider the relationship between the fused features and the highest-level features. In this paper, we focus on this feature relationship and propose a novel global attention unit, termed the "perception-and-regulation" (PR) block, which adaptively regulates the feature fusion process by explicitly modelling the interdependencies between features. The perception part uses the structure of the fully connected layers in classification networks to learn the size and shape of objects. The regulation part selectively strengthens and weakens the features to be fused. An imitating eye observation module (IEO) is further employed to improve the global perception capability of the network. By imitating foveal and peripheral vision, the IEO scrutinizes highly detailed objects and organizes a broad spatial scene to better segment objects. Extensive experiments on the SOD datasets demonstrate that the proposed method performs favourably against 29 state-of-the-art methods.
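The abstract describes the PR block only at a high level. The lines below are a minimal PyTorch sketch of a channel-gating block in that spirit, assuming a squeeze-and-excitation-style design: a perception branch applies fully connected layers to a globally pooled guiding feature, and the resulting weights regulate (rescale) a feature map before fusion. The class name PRGate, the reduction parameter, and the exact wiring are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class PRGate(nn.Module):
    # Hypothetical channel-wise gate: perceive global context of a guiding feature,
    # then regulate (rescale) another feature before it is fused.
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)            # global perception of the guiding feature
        self.fc = nn.Sequential(                       # FC layers, as in classification heads
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),                              # weights in (0, 1) strengthen/weaken channels
        )

    def forward(self, guide: torch.Tensor, feat: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = guide.shape
        w = self.fc(self.pool(guide).view(b, c)).view(b, c, 1, 1)
        return feat * w                                # regulated feature, ready for fusion

if __name__ == "__main__":
    gate = PRGate(channels=64)
    high = torch.randn(2, 64, 16, 16)                  # e.g. a high-level semantic feature
    low = torch.randn(2, 64, 16, 16)                   # feature to be regulated before fusion
    print(gate(high, low).shape)                       # torch.Size([2, 64, 16, 16])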
Pages: 6525-6537
Number of pages: 13