Spatial-Temporal Aware Long-Term Object Tracking

Times Cited: 4
Authors
Zhang, Wei [1 ,2 ]
Kang, Baosheng [2 ]
Zhang, Shunli [2 ]
Affiliations
[1] Baoji Univ Arts & Sci, Sch Comp Sci, Baoji 721016, Peoples R China
[2] Northwest Univ, Sch Informat Sci & Technol, Xian 710127, Peoples R China
Source
IEEE ACCESS | 2020, Vol. 8
Funding
National Natural Science Foundation of China;
Keywords
Target tracking; Correlation; Object tracking; Robustness; Uncertainty; Information filtering; Long-term tracking; spatial-temporal information; correlation filter; failure detection; object re-detection; VISUAL TRACKING;
DOI
10.1109/ACCESS.2020.2987464
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Most existing trackers perform well under short-term occlusions and appearance and illumination variations, but struggle with the challenges of long-term tracking, which include heavy or prolonged occlusions and out-of-view objects. In this paper, we propose a spatial-temporal aware long-term object tracking method to handle these challenges. First, we present a spatial-temporal aware correlation filter, which jointly models the spatial and temporal information of the target within the correlation filter framework to locate a visible target from frame to frame. Then, we design a tracking uncertainty detection mechanism that activates the re-detector when tracking fails; the mechanism relies on the variation of the correlation response measured spatially and the appearance similarity estimated temporally. Finally, we propose a spatial-temporal aware long-term re-detector to recover the target when it becomes visible again. In the re-detection process, a large number of candidates are sampled, evaluated, and refined spatially to obtain accurate locations, and reliable memory retained through conservative template updating enables recovery of the target by similarity matching. The spatial-temporal information is explicitly encoded in each component, and the components operate collaboratively to boost overall performance. Extensive experiments on publicly available benchmark datasets demonstrate that the proposed method performs favorably against other state-of-the-art trackers.
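
The abstract describes a tracking-uncertainty detection mechanism that combines a spatial cue (variation of the correlation response) with a temporal cue (appearance similarity to a conservatively updated template). The paper's exact criterion and thresholds are not given here; the following is a minimal illustrative sketch in Python, assuming a peak-to-sidelobe-ratio style measure for the spatial cue, cosine similarity for the temporal cue, and placeholder threshold values.

```python
import numpy as np

def peak_to_sidelobe_ratio(response, exclude=5):
    # Spatial cue: how well the peak of the correlation response stands out
    # from the surrounding sidelobe region. A low value hints at drift/occlusion.
    peak_pos = np.unravel_index(np.argmax(response), response.shape)
    peak = response[peak_pos]
    mask = np.ones_like(response, dtype=bool)
    r, c = peak_pos
    mask[max(r - exclude, 0):r + exclude + 1,
         max(c - exclude, 0):c + exclude + 1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-8)

def appearance_similarity(template, patch):
    # Temporal cue: cosine similarity between a conservatively updated
    # template and the patch at the current estimated target location.
    t = template.ravel().astype(np.float64)
    p = patch.ravel().astype(np.float64)
    return float(t @ p / (np.linalg.norm(t) * np.linalg.norm(p) + 1e-8))

def tracking_uncertain(response, template, patch,
                       psr_thresh=8.0, sim_thresh=0.5):
    # Flag tracking failure (and hence activate the re-detector) when either
    # cue falls below its threshold. The threshold values here are
    # placeholders, not those reported in the paper.
    return (peak_to_sidelobe_ratio(response) < psr_thresh
            or appearance_similarity(template, patch) < sim_thresh)
```
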
Pages: 71662-71684
Number of Pages: 23