State-Driven Particle Filter for Multi-person Tracking

Cited by: 0
Authors
Geronimo Gomez, David [1 ,2 ]
Lerasle, Frederic [3 ,4 ]
Lopez Pena, Antonio M. [1 ,2 ]
Affiliations
[1] Campus Univ Autonoma Barcelona, Comp Vis Ctr, Bellaterra 08193, Spain
[2] Campus Univ Autonoma Barcelona, Dept Comp Sci, Bellaterra 08193, Spain
[3] CNRS, LAAS, F-31077 Toulouse, France
[4] Univ Toulouse UPS, F-31077 Toulouse, France
Source
ADVANCED CONCEPTS FOR INTELLIGENT VISION SYSTEMS (ACIVS 2012) | 2012 / Vol. 7517
Keywords
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Multi-person tracking can be exploited in applications such as driver assistance, surveillance, multimedia and human-robot interaction. Combined with human detectors, particle filters offer a robust method that filters noisy detections and provides temporal coherence. However, common problems such as occlusions with other targets or the scene, temporal drift, or the detection of lost targets are rarely considered, degrading system performance. Some authors propose to overcome these problems using heuristics that are neither explained nor formalized in their papers, for instance by defining exceptions to model updating depending on track overlap. In this paper we propose to formalize these events with a state graph that explicitly defines the current state of each track (e.g., potential, tracked, occluded or lost) and the transitions between states. This approach has the advantage of linking track states to track actions, such as the online updating of the underlying models, which gives flexibility to the system. It provides an explicit representation for adapting the multiple parallel trackers to the context, i.e., each track can use a specific filtering strategy, dynamic model, number of particles, etc., depending on its state. We implement this technique in a single-camera multi-person tracker and evaluate it on public video sequences.
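The state-graph idea in the abstract can be sketched as a small state machine: each track carries a state, transitions are restricted to the graph's edges, and the track's filter configuration is looked up from its current state. The state names (potential, tracked, occluded, lost) come from the abstract; the transition edges and the per-state parameter values below are illustrative assumptions, not the paper's actual settings.

```python
from enum import Enum, auto

class TrackState(Enum):
    POTENTIAL = auto()  # candidate track awaiting confirmation
    TRACKED = auto()    # confirmed and currently observed
    OCCLUDED = auto()   # temporarily hidden by another target or the scene
    LOST = auto()       # no longer observable; candidate for deletion

# Allowed transitions of the state graph (illustrative edges, not the paper's exact graph).
TRANSITIONS = {
    TrackState.POTENTIAL: {TrackState.TRACKED, TrackState.LOST},
    TrackState.TRACKED: {TrackState.TRACKED, TrackState.OCCLUDED, TrackState.LOST},
    TrackState.OCCLUDED: {TrackState.TRACKED, TrackState.LOST},
    TrackState.LOST: set(),
}

# Per-state tracker configuration: each track adapts its particle filter to its
# state, e.g. suspending model updates while occluded (placeholder values).
CONFIG = {
    TrackState.POTENTIAL: {"num_particles": 50, "update_model": False},
    TrackState.TRACKED:   {"num_particles": 100, "update_model": True},
    TrackState.OCCLUDED:  {"num_particles": 200, "update_model": False},
    TrackState.LOST:      {"num_particles": 0, "update_model": False},
}

class Track:
    def __init__(self):
        self.state = TrackState.POTENTIAL

    def transition(self, new_state):
        """Move along an edge of the state graph, rejecting illegal moves."""
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

    @property
    def config(self):
        """Filtering parameters for this track's current state."""
        return CONFIG[self.state]
```

This makes the "exceptions" explicit: for instance, a track that becomes occluded automatically stops updating its appearance model, instead of relying on ad-hoc overlap checks scattered through the code.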
Pages: 467-478
Page count: 12
Related Papers
50 records in total
  • [41] Large Scale Real-World Multi-person Tracking
    Shuai, Bing
    Bergamo, Alessandro
    Buechler, Uta
    Berneshawi, Andrew
    Boden, Alyssa
    Tighe, Joseph
    COMPUTER VISION, ECCV 2022, PT VIII, 2022, 13668 : 504 - 521
  • [42] PROBABILISTIC MULTI-PERSON TRACKING USING DYNAMIC BAYES NETWORKS
    Klinger, T.
    Rottensteiner, F.
    Heipke, C.
    ISPRS GEOSPATIAL WEEK 2015, 2015, II-3 (W5): 435 - 442
  • [43] Real-time multi-person tracking in video surveillance
    Niu, W
    Jiao, L
    Han, D
    Wang, YF
    ICICS-PCM 2003, VOLS 1-3, PROCEEDINGS, 2003, : 1144 - 1148
  • [44] Online Multi-Person Tracking using Integral Channel Features
    Kieritz, Hilke
    Becker, Stefan
    Huebner, Wolfgang
    Arens, Michael
    2016 13TH IEEE INTERNATIONAL CONFERENCE ON ADVANCED VIDEO AND SIGNAL BASED SURVEILLANCE (AVSS), 2016, : 122 - 130
  • [45] Single Camera Multi-person Tracking Based on Crowd Simulation
    Jin, Zhixing
    Bhanu, Bir
    2012 21ST INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR 2012), 2012, : 3660 - 3663
  • [46] Hierarchical Online Multi-person Pose Tracking with Multiple Cues
    Xu, Chuanzhi
    Zhou, Yue
    NEURAL INFORMATION PROCESSING (ICONIP 2018), PT VI, 2018, 11306 : 318 - 328
  • [47] ONLINE MULTI-PERSON TRACKING VIA ROBUST COLLABORATIVE MODEL
    Naiel, Mohamed A.
    Ahmad, M. Omair
    Swamy, M. N. S.
    Wu, Yi
    Yang, Ming-Hsuan
    2014 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2014, : 431 - 435
  • [48] Multi-Person Tracking by Discriminative Affinity Model and Hierarchical Association
    Li, Minghua
    Liu, Zhengxi
    Xiong, Yunyu
    Li, Zheng
    PROCEEDINGS OF 2017 3RD IEEE INTERNATIONAL CONFERENCE ON COMPUTER AND COMMUNICATIONS (ICCC), 2017, : 1741 - 1745
  • [49] Joint multi-person detection and tracking from overlapping cameras
    Liem, Martijn C.
    Gavrila, Dariu M.
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2014, 128 : 36 - 50
  • [50] A Multi-Person Collaborative Design Method Driven by Augmented Reality
    Gao, Liqun
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2024, 15 (11) : 976 - 984