Real-time tracking based on deep feature fusion

Cited: 4
Authors
Pang, Yuhang [1 ]
Li, Fan [1 ]
Qiao, Xiaoya [1 ]
Gilman, Andrew [2 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Informat & Commun Engn, Xian 710049, Peoples R China
[2] Massey Univ, Inst Nat & Math Sci, Auckland, New Zealand
Funding
US National Science Foundation;
Keywords
Visual tracking; Convolutional neural network; Feature fusion; Correlation filters; ROBUST VISUAL TRACKING; OBJECT TRACKING; NETWORK; MODEL;
DOI
10.1007/s11042-020-09267-w
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline code
0812;
Abstract
Deep learning-based methods have recently attracted significant attention in the visual tracking community, leading to an increase in state-of-the-art tracking performance. However, because these methods rely on more complex models, the gain has been accompanied by a decrease in speed. Real-time tracking applications require a careful balance of performance and speed. We propose a real-time tracking method based on deep feature fusion, which combines deep learning with kernel correlation filters. First, hierarchical features are extracted from a lightweight pre-trained convolutional neural network. Then, original features from different levels are fused using canonical correlation analysis. The fused features, together with some of the original deep features, are used in three kernel correlation filters to track the target. An adaptive update strategy, based on dispersion analysis of the correlation filters' response maps, is proposed to improve robustness to target appearance changes, and different update frequencies are adopted for the three filters to adapt to severe appearance changes. We perform extensive experiments on two benchmarks: OTB-50 and OTB-100. Quantitative and qualitative evaluations show that the proposed method performs favorably against several state-of-the-art methods, even outperforming algorithms that use complex network models. Furthermore, the proposed algorithm runs at more than 20 frames per second (FPS) and is hence able to achieve near real-time tracking.
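The abstract does not specify which dispersion measure drives the adaptive update. A common choice in correlation-filter tracking is the peak-to-sidelobe ratio (PSR): a sharp, well-localized response peak yields a high PSR, while a dispersed response (occlusion, drastic appearance change) yields a low one, signaling that the filter update should be skipped. The sketch below illustrates this general idea only; the function names, exclusion window, and threshold are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def psr(response, exclude=5):
    """Peak-to-sidelobe ratio of a correlation response map.

    The sidelobe is everything outside a small window around the peak.
    A high PSR indicates a confident, concentrated response; a low PSR
    indicates a dispersed response. The window size and the epsilon in
    the denominator are illustrative choices, not values from the paper.
    """
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, py - exclude):py + exclude + 1,
         max(0, px - exclude):px + exclude + 1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

def should_update(response, threshold=8.0):
    """Gate the filter update: skip it when the response is too dispersed.

    The threshold is a hypothetical value for illustration; in practice it
    would be tuned per filter, consistent with the paper's use of different
    update frequencies for its three correlation filters.
    """
    return psr(response) >= threshold
```

A response map with a single sharp peak passes the gate, whereas a flat (fully dispersed) map has PSR 0 and is rejected; per-filter thresholds would let the three filters update at different rates.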
Pages: 27229-27255 (27 pages)