Tracking in object action space

Cited by: 3
|
Authors
Kruger, Volker [1 ]
Herzog, Dennis [1 ]
Affiliation
[1] Aalborg Univ, Dept Mech Engn & Prod, Aalborg, Denmark
Keywords
Action recognition; Parametric gestures; Tracking; Pose estimation; HIDDEN MARKOV-MODELS; MOTION CAPTURE; RECOGNITION; POSE; PEOPLE;
DOI
10.1016/j.cviu.2013.02.002
CLC classification
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper we focus on the joint problem of tracking humans and recognizing human actions in scenarios such as a kitchen or a setting where a robot cooperates with a human, e.g., in a manufacturing task. In these scenarios, the human interacts directly with objects, either by physically using or manipulating them or by, e.g., pointing at them, as in "Give me that ...". Recognizing these types of human actions is difficult because (a) they must be recognized independently of scene parameters such as viewing direction, and (b) the actions are parametric, where the parameters are either object-dependent or, as in the case of a pointing direction, convey important information. One common approach is to perform 3D human body tracking followed by action recognition based on the captured tracking data. For the scenarios considered here, we argue that 3D body tracking and action recognition should be seen as an intertwined problem that is primed by the objects on which the actions are applied. In this paper, we look at human body tracking and action recognition from an object-driven perspective. Instead of the space of human body poses, we consider the space of object affordances, i.e., the space of possible actions that can be applied to a given object. This way, 3D body tracking reduces to action tracking in the object- (and context-) primed parameter space of the object affordances, which reduces the high-dimensional joint space to a low-dimensional action space. In our approach, we use parametric hidden Markov models to represent parametric movements, and particle filtering to track in the space of action parameters. We demonstrate the effectiveness of the approach on synthetic and real image sequences using single-arm human upper-body actions that involve objects. (C) 2013 Elsevier Inc. All rights reserved.
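The abstract's core idea — tracking in a low-dimensional action-parameter space instead of the full joint space — can be illustrated with a minimal sketch. The following is a hypothetical Python toy, not the authors' implementation: `movement_mean` stands in for a parametric movement model (the role played by the parametric HMM's output mean in the paper), and a particle filter tracks a 3-D state (a 2-D action parameter, e.g., a reach target, plus a movement phase) from noisy 2-D observations.

```python
import numpy as np

rng = np.random.default_rng(0)

def movement_mean(theta, phase):
    """Hypothetical parametric movement: position along a reach whose
    end point is given by the action parameter `theta`. Stands in for
    a PHMM-style parametric output mean."""
    return np.array([theta[0] * phase, theta[1] * phase])

def particle_filter(observations, n_particles=500, obs_std=0.1):
    """Track in the low-dimensional action-parameter space: each
    particle is (theta_x, theta_y, phase), not a full body pose."""
    T = len(observations)
    theta = rng.uniform(-1.0, 1.0, size=(n_particles, 2))  # init params
    phase = np.zeros(n_particles)                          # init phase
    for z in observations:
        # propagate: phase advances with noise, parameters diffuse
        phase = np.clip(phase + rng.normal(1.0 / T, 0.02, n_particles), 0, 1)
        theta = theta + rng.normal(0.0, 0.02, theta.shape)
        # weight particles by the likelihood of the 2-D observation
        pred = np.stack([theta[:, 0] * phase, theta[:, 1] * phase], axis=1)
        w = np.exp(-np.sum((pred - z) ** 2, axis=1) / (2 * obs_std ** 2))
        w = np.full(n_particles, 1.0 / n_particles) if w.sum() == 0 else w / w.sum()
        # resample according to the weights
        idx = rng.choice(n_particles, n_particles, p=w)
        theta, phase = theta[idx], phase[idx]
    return theta.mean(axis=0)  # estimated action parameter

# simulate a noisy reach toward a (hypothetical) object at (0.8, 0.4)
true_theta = np.array([0.8, 0.4])
obs = [movement_mean(true_theta, p) + rng.normal(0, 0.05, 2)
       for p in np.linspace(0, 1, 30)]
est = particle_filter(obs)
```

Here `est` recovers the action parameter (the object-dependent reach target), and tracking happens entirely in the 3-dimensional action space rather than a high-dimensional joint space, which is the dimensionality-reduction point the abstract makes.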
Pages: 764-789
Page count: 26
Related papers
50 records
  • [1] Space Object Tracking (SPOT) Facility
    Shivitz, Robert
    Kendrick, Richard
    Mason, James
    Bold, Matthew
    Kubo, Tracy
    Bock, Kevin
    Tyler, David
    GROUND-BASED AND AIRBORNE TELESCOPES V, 2014, 9145
  • [2] Tracking an object through feature space
    Erik Blaser
    Zenon W. Pylyshyn
    Alex O. Holcombe
    Nature, 2000, 408 : 196 - 199
  • [3] Tracking an object through feature space
    Blaser, E
    Pylyshyn, ZW
    Holcombe, AO
    NATURE, 2000, 408 (6809) : 196 - 199
  • [4] Space Object Tracking with Delayed Measurements
    Chen, Huimin
    Shen, Dan
    Chen, Genshe
    Blasch, Erik
    Pham, Khanh
    SPACE MISSIONS AND TECHNOLOGIES, 2010, 7691
  • [5] Does action disrupt multiple object tracking?
    Thornton, I. M.
    Horowitz, T. S.
    Buelthoff, H. H.
    PERCEPTION, 2014, 43 (10) : 1128 - 1128
  • [6] A New Method of Tracking Group Space Object
    Zhang Zi-xu
    Zhang Wei
    Chen Ming-yan
    2013 INTERNATIONAL WORKSHOP ON MICROWAVE AND MILLIMETER WAVE CIRCUITS AND SYSTEM TECHNOLOGY (MMWCST), 2013, : 403 - 406
  • [7] MULTIPLE OBJECT TRACKING USING A TRANSFORM SPACE
    Li, Minglei
    Li, Jiasong
    Tamayo, Alexis
    Nan, Liangliang
    XXIV ISPRS CONGRESS IMAGING TODAY, FORESEEING TOMORROW, COMMISSION IV, 2022, 5-4 : 137 - 143
  • [8] DVS Benchmark Datasets for Object Tracking, Action Recognition, and Object Recognition
    Hu, Yuhuang
    Liu, Hongjie
    Pfeiffer, Michael
    Delbruck, Tobi
    FRONTIERS IN NEUROSCIENCE, 2016, 10
  • [9] SPACE AS AN INTERFACE IN THE RELATION OF CONCEPT ACTION OBJECT
    Arayici, Osman
    TURKISH ONLINE JOURNAL OF DESIGN ART AND COMMUNICATION, 2018, 8 (01): : 97 - 103
  • [10] Space object tracking with perturbation in space-based optical surveillance
    School of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073, China
    Dianzi Yu Xinxi Xuebao, 2009, 9 (2088-2092):