Accurate Hierarchical Human Actions Recognition From Kinect Skeleton Data

Cited by: 23

Authors
Su, Benyue [1 ,2 ]
Wu, Huang [1 ,2 ]
Sheng, Min [2 ,3 ]
Shen, Chuansheng [2 ,3 ]
Affiliations
[1] Anqing Normal Univ, Sch Comp & Informat, Anqing 246133, Peoples R China
[2] Intelligent Percept & Comp Key Lab Anhui Prov, Anqing 246133, Anhui, Peoples R China
[3] Anqing Normal Univ, Sch Math & Computat Sci, Anqing 246133, Peoples R China
Source
IEEE ACCESS | 2019, Vol. 7
Keywords
Activity recognition; statistical learning; supervised learning; rehabilitation
DOI
10.1109/ACCESS.2019.2911705
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline Classification Code
0812
Abstract
Human action recognition has become one of the most active research topics in natural human interaction and artificial intelligence, and has attracted much attention. Human movement ranges from simple to complex and from low-level to high-level, with increasing complexity and data noise; in other words, human actions form a complicated hierarchy. Hierarchy theory can efficiently describe these hierarchical relationships. Accordingly, a hierarchical framework for human-action recognition is designed in this paper: features are selected according to the level of the action, and specific classifiers are chosen for the different features. In particular, a two-level hierarchical recognition framework is constructed and tested on Kinect skeleton data. At the first level, we use a support vector machine (SVM) for coarse-grained classification, while at the second level we use a combination of an SVM and a hidden Markov model (HMM) for fine-grained classification. Ten-fold cross-validation is used in our performance evaluation on public and self-built datasets, achieving average recognition rates of 95.69% and 97.64%, respectively. These outstanding results imply that hierarchical, step-wise precise classification reflects the inherent structure of human action well.
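The two-level pipeline the abstract describes — a coarse-grained SVM at the first level, then a finer classifier per coarse group at the second — can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the paper's Kinect skeleton features are replaced by random Gaussian features, and a plain SVM stands in for the SVM+HMM combination at the second stage.

```python
# Hedged sketch of a two-level hierarchical classifier (assumed structure;
# the paper's feature extraction and HMM component are NOT reproduced here).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic "skeleton features": 4 fine action classes, grouped 2-per-coarse-class.
# Fine labels 0,1 belong to coarse class 0; fine labels 2,3 to coarse class 1.
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(50, 6)) for c in range(4)])
y_fine = np.repeat(np.arange(4), 50)
y_coarse = y_fine // 2

# Level 1: one SVM performs coarse-grained classification over all samples.
coarse_clf = SVC(kernel="rbf").fit(X, y_coarse)

# Level 2: one fine-grained SVM per coarse group (the paper combines an SVM
# with a hidden Markov model at this stage; a plain SVM is used instead).
fine_clfs = {
    g: SVC(kernel="rbf").fit(X[y_coarse == g], y_fine[y_coarse == g])
    for g in np.unique(y_coarse)
}

def predict(x):
    """Route a sample through the coarse classifier, then the matching fine one."""
    g = int(coarse_clf.predict(x.reshape(1, -1))[0])
    return int(fine_clfs[g].predict(x.reshape(1, -1))[0])

preds = np.array([predict(x) for x in X])
accuracy = (preds == y_fine).mean()
```

On this well-separated toy data the hierarchy recovers nearly all fine labels; the point is only the routing structure — coarse decision first, then a specialized classifier — not the accuracy figures reported in the paper.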
Pages: 52532-52541
Page count: 10
Related Papers
50 items in total
  • [41] Sparse Modeling of Human Actions from Motion Imagery
    Castrodad, Alexey
    Sapiro, Guillermo
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2012, 100 (01) : 1 - 15
  • [43] Feature extraction for human activity recognition on streaming data
    Yala, Nawel
    Fergani, Belkacem
    Fleury, Anthony
    2015 INTERNATIONAL SYMPOSIUM ON INNOVATIONS IN INTELLIGENT SYSTEMS AND APPLICATIONS (INISTA) PROCEEDINGS, 2015, : 262 - 267
  • [44] Overview of Human Activity Recognition Using Sensor Data
    Hamad, Rebeen Ali
    Woo, Wai Lok
    Wei, Bo
    Yang, Longzhi
    ADVANCES IN COMPUTATIONAL INTELLIGENCE SYSTEMS, UKCI 2022, 2024, 1454 : 380 - 391
  • [45] Human Activity Recognition Using Ambient Sensor Data
    Aida, Skamo
    Kevric, Jasmin
    IFAC PAPERSONLINE, 2022, 55 (04): : 97 - 102
  • [46] Discovery and Recognition of Emerging Human Activities Using a Hierarchical Mixture of Directional Statistical Models
    Fang, Lei
    Ye, Juan
    Dobson, Simon
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2020, 32 (07) : 1304 - 1316
  • [47] Hierarchical evolutionary classification framework for human action recognition using sparse dictionary optimization
    Jansi, R.
    Amutha, R.
    SWARM AND EVOLUTIONARY COMPUTATION, 2021, 63
  • [48] HL-HAR: Hierarchical Learning Based Human Activity Recognition in Wearable Computing
    Liu, Yan
    Zhao, Wentao
    Liu, Qiang
    Yu, Linyuan
    Wang, Dongxu
    CLOUD COMPUTING AND SECURITY, PT II, 2017, 10603 : 684 - 693
  • [49] Spatial Hard Attention Modeling via Deep Reinforcement Learning for Skeleton-Based Human Activity Recognition
    Nikpour, Bahareh
    Armanfard, Narges
    IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2023, 53 (07): : 4291 - 4301
  • [50] Human Activity Recognition Using Thigh Angle Derived from Single Thigh Mounted IMU Data
    Abhayasinghe, Nimsiri
    Murray, Iain
    2014 INTERNATIONAL CONFERENCE ON INDOOR POSITIONING AND INDOOR NAVIGATION (IPIN), 2014, : 111 - 115