Moving Object Tracking based on Kernel and Random-coupled Neural Network

Cited: 0
Authors
Chen, Yiran [1 ,5 ]
Liu, Haoran [1 ,2 ]
Liu, Mingzhe [2 ]
Liu, Yanhua [1 ]
Wang, Ruili [3 ]
Li, Peng [4 ]
Affiliations
[1] Chengdu Univ Technol, Chengdu, Peoples R China
[2] Wenzhou Univ Technol, Wenzhou, Peoples R China
[3] Massey Univ, Auckland, New Zealand
[4] Southwestern Inst Phys, Chengdu, Peoples R China
[5] Chengdu Univ Technol, Coll Nucl Technol & Automat Engn, Chengdu 610059, Peoples R China
Source
PROCEEDINGS OF THE 6TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA IN ASIA, MMASIA 2024 | 2024
Keywords
Moving object tracking; Kernel-based tracking; Random-coupled neural network; Spiking neural networks; MEAN-SHIFT; ONLINE
DOI
10.1145/3696409.3700168
CLC Number
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Moving object tracking on cost-effective hardware is a crucial need in numerous research and industrial applications. However, current deep learning-based tracking algorithms usually prioritize exceptional performance at the expense of increased computational load. Because expensive GPUs are unavailable for many tracking tasks, these popular trackers often fall short in providing robust tracking capabilities with affordable computational resources. This study introduces RCNNshift, a kernel-based tracker that relies on feature extraction from a random-coupled neural network. This visual-cortex-inspired neural model can extract image features without requiring cumbersome pre-training or deep neural connections. By utilizing an enhanced one-dimensional feature representation, RCNNshift demonstrates superior performance compared to other kernel-based object tracking methods, even those employing higher-dimensional feature spaces. Its improvements in the precision and success plots of OPE, compared to Meanshift and Camshift in the HSV and RGB color spaces, exceed 160% and 190%, respectively. Comparative experiments have validated the robustness of RCNNshift, showcasing its superior performance over various kernel-based and particle filter trackers. Its combination of robustness and computational efficiency makes RCNNshift an ideal choice for mid- to low-end object tracking tasks such as surveillance and underwater tracking. The source code is available at https://github.com/HaoranLiu507/RCNNshift.
Pages: 6