Linear latent low dimensional space for online early action recognition and prediction

Cited by: 26
Authors
Bloom, Victoria [1 ]
Argyriou, Vasileios [2 ]
Makris, Dimitrios [2 ]
Affiliations
[1] Coventry Univ, Fac Engn Environm & Comp, Gulson Rd, Coventry CV1 2JH, W Midlands, England
[2] Kingston Univ, Digital Informat Res Ctr, Penrhyn Rd, Kingston Upon Thames KT1 2EE, Surrey, England
Keywords
Action recognition; Action prediction; Dimensionality reduction; Representation; Selection; Model
DOI
10.1016/j.patcog.2017.07.003
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recognition and prediction of human actions are important tasks in various computer vision applications, including video surveillance, human-computer interaction and home entertainment, that require online and real-time approaches. In this work, we propose a novel approach that utilises continuous streams of joint motion data for recognising and predicting actions in linear latent spaces, operating online and in real time. Our approach is based on supervised learning and dimensionality reduction techniques that map high-dimensional nonlinear actions into linear latent low-dimensional spaces. Our methodology has been evaluated using well-known datasets and performance metrics specifically designed for online and real-time action recognition and prediction. We demonstrate the performance of the proposed approach in a comparative study showing high accuracy and low latency. (C) 2017 Elsevier Ltd. All rights reserved.
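The core idea the abstract describes, projecting high-dimensional streams of skeleton joint data into a linear latent low-dimensional space, can be sketched with a plain PCA-style projection. This is only an illustrative approximation of that idea, not the paper's actual method; the function names, feature dimensions, and use of SVD-based PCA here are all assumptions.

```python
import numpy as np

def fit_linear_latent_space(X, n_components):
    """Fit a linear low-dimensional latent space via SVD-based PCA.

    X: (n_frames, n_features) matrix, e.g. flattened skeleton joint
    coordinates per frame. Returns the feature mean and the top
    principal directions as a (n_components, n_features) matrix.
    """
    mean = X.mean(axis=0)
    # Rows of Vt are the principal directions of the centered data.
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]

def project(X, mean, components):
    """Project frames of joint data into the latent space."""
    return (X - mean) @ components.T

# Toy example: 100 frames of 20 joints x 3 coordinates = 60-dim features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 60))
mean, comps = fit_linear_latent_space(X, n_components=5)
Z = project(X, mean, comps)
print(Z.shape)  # (100, 5)
```

In an online setting, each incoming frame would be projected with `project` as it arrives, so recognition can operate on the low-dimensional trajectory rather than the raw joint stream.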
Pages: 532-547
Page count: 16