Frame-Event Alignment and Fusion Network for High Frame Rate Tracking

Cited by: 18
Authors
Zhang, Jiqing [1 ]
Wang, Yuanchen [1 ]
Liu, Wenxi [2 ]
Li, Meng [3 ]
Bai, Jinpeng [1 ]
Yin, Baocai [1 ]
Yang, Xin [1 ]
Affiliations
[1] Dalian Univ Technol, Dalian, Peoples R China
[2] Fuzhou Univ, Fuzhou, Peoples R China
[3] HiSilicon Shanghai Technol Co Ltd, Shanghai, Peoples R China
Source
2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR) | 2023
Funding
National Natural Science Foundation of China
DOI
10.1109/CVPR52729.2023.00943
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Most existing RGB-based trackers target low frame rate benchmarks of around 30 frames per second. This setting restricts a tracker's usefulness in real-world scenarios, especially under fast motion. Event-based cameras, as bio-inspired sensors, offer considerable potential for high frame rate tracking thanks to their high temporal resolution; however, they cannot capture the fine-grained texture information that conventional cameras provide. This unique complementarity motivates us to combine conventional frames and events for high frame rate object tracking under various challenging conditions. In this paper, we propose an end-to-end network consisting of multi-modality alignment and fusion modules that effectively combine meaningful information from the two modalities at different measurement rates. The alignment module performs cross-style and cross-frame-rate alignment between the frame and event modalities under the guidance of the motion cues furnished by events, while the fusion module emphasizes valuable features and suppresses noise by exploiting the mutual complement between the two modalities. Extensive experiments show that the proposed approach outperforms state-of-the-art trackers by a significant margin in high frame rate tracking. On the FE240hz dataset, our approach achieves high frame rate tracking at up to 240 Hz.
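The sketch below illustrates, in PyTorch, two ideas the abstract describes only at a high level: binning an asynchronous event stream into frame-like tensors at an arbitrary target rate (which is what makes 240 Hz tracking feasible alongside a ~30 fps camera), and a toy gated fusion block in the spirit of emphasizing valuable features while suppressing noise. All names and the gating design here are illustrative assumptions, not the architecture proposed in the paper; see the DOI above for the authors' actual method.

# Minimal, self-contained sketch (PyTorch). Names and design choices are
# illustrative assumptions, not the architecture from the paper.
import torch
import torch.nn as nn

def events_to_frame(events, height, width, t0, t1):
    # Accumulate asynchronous events, given as rows (x, y, timestamp,
    # polarity), that fall inside the window [t0, t1) into a 2-channel
    # frame-like tensor (one channel per polarity). Choosing many short
    # windows per RGB exposure yields event "frames" at rates such as
    # 240 Hz even when the conventional camera runs at ~30 fps.
    frame = torch.zeros(2, height, width)
    mask = (events[:, 2] >= t0) & (events[:, 2] < t1)
    for x, y, _, p in events[mask]:
        frame[int(p > 0), int(y), int(x)] += 1.0
    return frame

class GatedFusion(nn.Module):
    # Toy cross-modality fusion: each modality predicts a spatial gate for
    # the other, so informative regions are emphasized and noisy regions
    # are suppressed before the gated features are summed.
    def __init__(self, channels):
        super().__init__()
        self.gate_from_frame = nn.Sequential(nn.Conv2d(channels, 1, 1), nn.Sigmoid())
        self.gate_from_event = nn.Sequential(nn.Conv2d(channels, 1, 1), nn.Sigmoid())

    def forward(self, feat_frame, feat_event):
        # Events gate the frame features and vice versa (mutual complement).
        return (feat_frame * self.gate_from_event(feat_event)
                + feat_event * self.gate_from_frame(feat_frame))

if __name__ == "__main__":
    # events: N x 4 rows of (x, y, timestamp in seconds, polarity in {-1, +1})
    events = torch.tensor([[10.0, 20.0, 0.001, 1.0],
                           [11.0, 20.0, 0.003, -1.0]])
    ev_frame = events_to_frame(events, height=64, width=64, t0=0.0, t1=1 / 240)
    fuse = GatedFusion(channels=2)
    fused = fuse(torch.randn(1, 2, 64, 64), ev_frame.unsqueeze(0))
    print(fused.shape)  # torch.Size([1, 2, 64, 64])

Running the example fuses a random 2-channel frame feature with the binned event frame; in a real tracker both inputs would be backbone features rather than raw tensors, and the 1/240 s window would slide across the event stream between consecutive RGB exposures.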
Pages: 9781-9790
Page count: 10
Related papers
50 items in total
  • [31] High frame rate picture rate conversion
    Bellers, E. B.
    Janssen, J. G. W.
IDW '06: PROCEEDINGS OF THE 13TH INTERNATIONAL DISPLAY WORKSHOPS, VOLS 1-3, 2006: 1985-1988
  • [32] FFFN: Frame-By-Frame Feedback Fusion Network for Video Super-Resolution
    Zhu, Jian
    Zhang, Qingwu
    Fei, Lunke
    Cai, Ruichu
    Xie, Yuan
    Sheng, Bin
    Yang, Xiaokang
IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25: 6821-6835
  • [33] Combined frame- and event-based detection and tracking
    Liu, Hongjie
    Moeys, Diederik Paul
    Das, Gautham
    Neil, Daniel
    Liu, Shih-Chii
    Delbruck, Tobi
2016 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2016: 2511-2514
  • [34] High Frame-Rate Television
    Armstrong, M. G.
    Flynn, D. J.
    Hammond, M. E.
    Jolly, S. J. E.
    Salmon, R. A.
SMPTE MOTION IMAGING JOURNAL, 2009, 118 (07): 54-58
  • [35] Particle Filter Tracking in Low Frame Rate Video
    Zhang Tao
    Fei Shumin
    Wang Lili
2011 30TH CHINESE CONTROL CONFERENCE (CCC), 2011: 3254-3259
  • [36] Object tracking in low-frame-rate video
    Porikli, F
    Tuzel, O
IMAGE AND VIDEO COMMUNICATIONS AND PROCESSING 2005, PTS 1 AND 2, 2005, 5685: 72-79
  • [38] A Robust Tracking System for Low Frame Rate Video
    Zhang, Xiaoqin
    Hu, Weiming
    Xie, Nianhua
    Bao, Hujun
    Maybank, Stephen
INTERNATIONAL JOURNAL OF COMPUTER VISION, 2015, 115 (03): 279-304
  • [39] Collaborative Low Frame Rate UAV Tracking by Proposals
    Wang, Yong
    Zhou, Jiaqi
    Liang, Juntao
    Zhu, Xiangyu
    Qiu, Zhoujingzi
IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (11): 10129-10136
  • [40] The Impact of Frame Semantic Annotation Levels, Frame-Alignment Techniques, and Fusion Methods on Factoid Answer Processing
    Ofoghi, Bahadorreza
    Yearwood, John
    Ma, Liping
JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, 2009, 60 (02): 247-263