ROBUST RGB-T TRACKING VIA CONSISTENCY REGULATED SCENE PERCEPTION

Times Cited: 0
Authors
Kang, Bin [1 ]
Liu, Liwei [1 ]
Zhao, Shihao [1 ]
Du, Songlin [2 ]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Sch Internet Things, Nanjing, Peoples R China
[2] Southeast Univ, Sch Automat, Nanjing, Peoples R China
Source
2023 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP | 2023
Keywords
RGB-T tracking; Graph-based global reasoning; Self-supervised learning;
DOI
10.1109/ICIP49359.2023.10222904
Chinese Library Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
RGB-T tracking has received increasing attention due to its significant advantage under severe weather conditions. Existing RGB-T tracking methods pay close attention to the representation of target appearance while ignoring the importance of scene information. In this paper, we propose a global reasoning-oriented method for RGB-T tracking. In particular, within a multi-task learning framework, our approach adopts a nested global reasoning model to regulate the consistency of scene perception (reasoning about the relation between targets and the surrounding semantic regions) across different image domains. Moreover, a meta-unsupervised learning strategy is designed to push the nested global reasoning model to utilize partial multi-domain target information when updating scene perception. Extensive experiments on the GTOT, RGBT210, and LasHeR datasets show the superior performance of our method compared with related works.
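The abstract describes the method only at a high level. The sketch below is a minimal, hypothetical illustration of the two ingredients it names: graph-based global reasoning over scene regions and a consistency constraint between the RGB and thermal branches. It is not the authors' implementation; the module GlobalReasoningUnit, the function scene_consistency_loss, and all shapes and hyperparameters are assumptions chosen for illustration, loosely following GloRe-style graph reasoning in PyTorch.

# Hypothetical sketch: graph reasoning over scene nodes with a cross-domain
# consistency term. Names, shapes, and losses are illustrative assumptions,
# not the paper's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GlobalReasoningUnit(nn.Module):
    """Project spatial features onto a small set of graph nodes, reason over
    the node relations with 1x1 convolutions, and project back to the map."""

    def __init__(self, in_channels: int, node_channels: int = 64, num_nodes: int = 16):
        super().__init__()
        self.reduce = nn.Conv2d(in_channels, node_channels, kernel_size=1)
        self.project = nn.Conv2d(in_channels, num_nodes, kernel_size=1)
        # 1-D convolutions act as the adjacency / state-update steps of the GCN.
        self.gcn_adj = nn.Conv1d(num_nodes, num_nodes, kernel_size=1)
        self.gcn_state = nn.Conv1d(node_channels, node_channels, kernel_size=1)
        self.expand = nn.Conv2d(node_channels, in_channels, kernel_size=1)

    def forward(self, x: torch.Tensor):
        b, _, h, w = x.shape
        feat = self.reduce(x).flatten(2)                  # (B, C', H*W)
        assign = F.softmax(self.project(x).flatten(2), dim=-1)  # (B, N, H*W)
        nodes = torch.bmm(assign, feat.transpose(1, 2))   # (B, N, C') node features
        nodes = nodes + self.gcn_adj(nodes)               # reason over node relations
        nodes = F.relu(self.gcn_state(nodes.transpose(1, 2))).transpose(1, 2)
        out = torch.bmm(assign.transpose(1, 2), nodes)    # back to (B, H*W, C')
        out = out.transpose(1, 2).reshape(b, -1, h, w)
        return x + self.expand(out), nodes                # residual map + graph nodes


def scene_consistency_loss(nodes_rgb: torch.Tensor, nodes_tir: torch.Tensor):
    """Encourage consistent scene perception across image domains by aligning
    the node-affinity (relation) matrices of the RGB and thermal branches."""
    rel_rgb = torch.bmm(F.normalize(nodes_rgb, dim=-1),
                        F.normalize(nodes_rgb, dim=-1).transpose(1, 2))
    rel_tir = torch.bmm(F.normalize(nodes_tir, dim=-1),
                        F.normalize(nodes_tir, dim=-1).transpose(1, 2))
    return F.mse_loss(rel_rgb, rel_tir)


if __name__ == "__main__":
    reason = GlobalReasoningUnit(in_channels=256)
    rgb_feat = torch.randn(2, 256, 18, 18)   # backbone features, RGB search region
    tir_feat = torch.randn(2, 256, 18, 18)   # backbone features, thermal search region
    _, nodes_rgb = reason(rgb_feat)
    _, nodes_tir = reason(tir_feat)
    print(scene_consistency_loss(nodes_rgb, nodes_tir))

In this reading, the consistency term plays the role of the regulation described in the abstract: both modalities are asked to produce a similar relational structure among scene nodes, rather than merely similar target appearance features.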
Pages: 510-514
Number of pages: 5