A flow-guided self-calibration Siamese network for visual tracking

Cited by: 0
Authors
Zhenyang Qu
Hongbo Shi
Shuai Tan
Bing Song
Yang Tao
Institutions
[1] East China University of Science and Technology, The Key Laboratory of Smart Manufacturing in Energy Chemical Process of the Ministry of Education
Source
The Visual Computer | 2023 / Vol. 39
Keywords
Visual tracking; Optical flow; Siamese network; Attention;
Abstract
Existing Siamese-based trackers focus on online updates to handle target deformation and occlusion. Despite their accuracy and robustness, these trackers still suffer from model drift caused by errors that accumulate from tracking results. This paper therefore proposes a flow-guided self-calibration Siamese network to alleviate model drift. The network leverages the rich motion information in the current frame and adaptively optimizes the feature representation of the target. First, to compensate for the lack of motion information during tracking, the optical flow field is estimated before target localization, and a carefully designed gather network extracts deep motion features from the optical flow. Second, to avoid the cumulative errors introduced by tracking results, a novel self-calibration module adaptively updates the appearance model without relying on any tracking results; the module fuses appearance features and deep motion features via an attention mechanism. Finally, a synthetic loss function, which adds a competitive loss between samples to the original loss, yields a more expressive deep feature representation. Extensive experiments on standard benchmarks demonstrate the effectiveness of the proposed method.
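The abstract describes fusing appearance features and deep motion features via an attention mechanism to calibrate the appearance model. The paper's actual module architecture is not given here, so the following is only a minimal NumPy sketch of one plausible form, a channel-attention gate; the function and parameter names (`fuse_features`, `w_att`, `b_att`) are hypothetical, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fuse_features(appearance, motion, w_att, b_att):
    """Fuse appearance and motion feature maps of shape (C, H, W) with a
    channel-attention gate. The gate is computed from globally pooled
    motion features and decides, per channel, how strongly the motion
    cue calibrates the appearance representation."""
    # Global average pooling over spatial dims -> (C,)
    pooled = motion.mean(axis=(1, 2))
    # One linear layer + sigmoid gives per-channel gates in (0, 1)
    gate = sigmoid(w_att @ pooled + b_att)   # (C,)
    gate = gate[:, None, None]               # broadcast to (C, 1, 1)
    # Convex combination: gated motion correction of appearance features
    return gate * motion + (1.0 - gate) * appearance

rng = np.random.default_rng(0)
C, H, W = 8, 16, 16
appearance = rng.standard_normal((C, H, W))
motion = rng.standard_normal((C, H, W))
w_att = rng.standard_normal((C, C)) * 0.1
b_att = np.zeros(C)

fused = fuse_features(appearance, motion, w_att, b_att)
print(fused.shape)
```

Because the fusion is a per-channel convex combination, every fused value lies between the corresponding appearance and motion values, so the calibration can never push features outside the range spanned by the two cues.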
Pages: 625–637 (12 pages)