Real-time tracking based on deep feature fusion

Cited by: 4
Authors
Pang, Yuhang [1 ]
Li, Fan [1 ]
Qiao, Xiaoya [1 ]
Gilman, Andrew [2 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Informat & Commun Engn, Xian 710049, Peoples R China
[2] Massey Univ, Inst Nat & Math Sci, Auckland, New Zealand
Funding
US National Science Foundation
Keywords
Visual tracking; Convolutional neural network; Feature fusion; Correlation filters; ROBUST VISUAL TRACKING; OBJECT TRACKING; NETWORK; MODEL;
DOI
10.1007/s11042-020-09267-w
CLC number
TP [Automation and Computer Technology]
Discipline code
0812
Abstract
Deep learning-based methods have recently attracted significant attention in the visual tracking community, leading to an increase in state-of-the-art tracking performance. However, the use of more complex models has also brought a decrease in speed, and real-time tracking applications require a careful balance of performance and speed. We propose a real-time tracking method based on deep feature fusion, which combines deep learning with kernel correlation filters. First, hierarchical features are extracted from a lightweight pre-trained convolutional neural network. Then, the original features from different levels are fused using canonical correlation analysis. The fused features, together with some of the original deep features, are used in three kernel correlation filters to track the target. An adaptive update strategy, based on dispersion analysis of the correlation filters' response maps, is proposed to improve robustness to target appearance changes; different update frequencies are adopted for the three filters to cope with severe appearance changes. We perform extensive experiments on two benchmarks, OTB-50 and OTB-100. Quantitative and qualitative evaluations show that the proposed tracker performs favorably against several state-of-the-art methods, including algorithms that use more complex network models. Furthermore, the proposed algorithm runs at more than 20 frames per second (FPS) and is hence able to achieve near real-time tracking.
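The canonical-correlation-analysis fusion step described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the feature matrices are random stand-ins for vectorized deep features from two network levels, and the projection dimension `k` and regularization constant `reg` are assumptions.

```python
# Illustrative CCA-based fusion of two CNN feature levels (not the paper's code).
import numpy as np

def cca_fuse(X, Y, k, reg=1e-6):
    """Fuse two feature sets via canonical correlation analysis.

    X: (n, dx) and Y: (n, dy) hold row-wise feature vectors from two
    network levels. Returns the (n, 2k) concatenation of the top-k
    canonical projections of each set.
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Regularized covariance and cross-covariance matrices.
    Sxx = Xc.T @ Xc / (n - 1) + reg * np.eye(X.shape[1])
    Syy = Yc.T @ Yc / (n - 1) + reg * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / (n - 1)

    def inv_sqrt(S):
        # Inverse matrix square root of a symmetric positive-definite matrix.
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    Kx, Ky = inv_sqrt(Sxx), inv_sqrt(Syy)
    # SVD of the whitened cross-covariance yields the canonical directions.
    U, s, Vt = np.linalg.svd(Kx @ Sxy @ Ky)
    Wx = Kx @ U[:, :k]
    Wy = Ky @ Vt.T[:, :k]
    return np.concatenate([Xc @ Wx, Yc @ Wy], axis=1)

rng = np.random.default_rng(0)
low = rng.standard_normal((200, 64))    # stand-in for a shallow-layer feature
high = rng.standard_normal((200, 96))   # stand-in for a deep-layer feature
fused = cca_fuse(low, high, k=32)
print(fused.shape)  # (200, 64)
```

The fused vector would then feed the kernel correlation filters; in the paper, some of the original deep features are also retained alongside the fused ones.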
Pages: 27229-27255
Page count: 27