Kinematic joint descriptor and depth motion descriptor with convolutional neural networks for human action recognition

Times Cited: 17
Authors
Rani, S. Sandhya [1 ]
Naidu, G. Apparao [2 ]
Shree, V. Usha [3 ]
Affiliations
[1] JNTUH, Dept CSE, Malla Reddy Engn Coll Autonomous, Hyderabad, Telangana, India
[2] Vignans Inst Management & Technol Women, Dept CSE, Kondapur, Telangana, India
[3] Joginpally BR Engn Coll, Dept ECE, Hyderabad, Telangana, India
Keywords
Human action recognition; Skeleton joints; Depth maps; Kinematic joint descriptors; Convolutional neural network; Fusion; Accuracy;
DOI
10.1016/j.matpr.2020.09.052
Chinese Library Classification
T [Industrial Technology];
Discipline Classification Code
08;
Abstract
Human action recognition has attracted significant research interest due to its widespread applications across various fields. However, owing to challenges such as noisy and occluded data, viewpoint variations, and differences in body size, action recognition remains a difficult task. Most existing methods focus on a single data type, which limits the performance of the recognition system. To improve recognition performance, we model a new approach for human action recognition from two different data types: depth images and skeleton joints. Two descriptors are developed for action representation: the Differential Depth Motion History Image for depth maps and the Motion Kinematic Joint Descriptor for skeleton joints. To obtain a discriminative feature set, three different Convolutional Neural Network models are trained, and their results are fused for the final action classification. Experiments are carried out on two public datasets, and the results indicate that the proposed approach outperforms state-of-the-art methods. (C) 2020 Elsevier Ltd. All rights reserved.
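The abstract outlines a multi-stream design: separate CNNs are trained on the depth-based and skeleton-based descriptors, and their outputs are fused for the final decision. Below is a minimal sketch of that kind of score-level fusion over three CNN streams, not the authors' published architecture; the layer sizes, channel counts, and the names SmallCNN and fuse_scores are hypothetical, and simple probability averaging stands in for whichever fusion rule the paper actually uses.

    # Minimal sketch of late (score-level) fusion over three CNN streams.
    # All shapes and names are illustrative assumptions, not the paper's model.
    import torch
    import torch.nn as nn

    class SmallCNN(nn.Module):
        """Toy CNN stream producing per-class scores from one descriptor image."""
        def __init__(self, in_channels: int, num_classes: int):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),  # global pooling keeps the head input size fixed
            )
            self.classifier = nn.Linear(32, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            h = self.features(x).flatten(1)
            return self.classifier(h)

    def fuse_scores(score_list):
        """Score-level fusion: average class-probability vectors from all streams."""
        probs = [torch.softmax(s, dim=1) for s in score_list]
        return torch.stack(probs, dim=0).mean(dim=0)

    if __name__ == "__main__":
        num_classes = 20
        depth_stream = SmallCNN(in_channels=1, num_classes=num_classes)  # e.g. a depth motion image
        ddmhi_stream = SmallCNN(in_channels=1, num_classes=num_classes)  # e.g. a differential depth MHI
        joint_stream = SmallCNN(in_channels=3, num_classes=num_classes)  # e.g. a kinematic joint descriptor map

        depth_img = torch.randn(4, 1, 64, 64)
        ddmhi_img = torch.randn(4, 1, 64, 64)
        joint_img = torch.randn(4, 3, 64, 64)

        fused = fuse_scores([
            depth_stream(depth_img),
            ddmhi_stream(ddmhi_img),
            joint_stream(joint_img),
        ])
        predicted_action = fused.argmax(dim=1)  # final action label per sample
        print(predicted_action.shape)  # torch.Size([4])

In practice each stream would be trained on its own descriptor images (depth motion history images or kinematic joint maps rendered as images), and the fused probabilities would be taken only at test time.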
Pages: 3164 - 3173
Number of pages: 10
Related Papers
50 records in total
  • [21] Human Action Recognition based on Convolutional Neural Networks with a Convolutional Auto-Encoder
    Geng, Chi
    Song, JianXin
    PROCEEDINGS OF THE 2015 5TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCES AND AUTOMATION ENGINEERING, 2016, 42 : 933 - 938
  • [22] Human action recognition using genetic algorithms and convolutional neural networks
    Ijjina, Earnest Paul
    Chalavadi, Krishna Mohan
    PATTERN RECOGNITION, 2016, 59 : 199 - 212
  • [23] Recognition of Human Activities Using Depth Maps and the Viewpoint Feature Histogram Descriptor
    Sidor, Kamil
    Wysocki, Marian
    SENSORS, 2020, 20 (10)
  • [24] Real-time action detection in video surveillance using a sub-action descriptor with multi-convolutional neural networks
    Jin, C.-B.
    Do, T. D.
    Liu, M.
    Kim, H.
    JOURNAL OF INSTITUTE OF CONTROL, ROBOTICS AND SYSTEMS, 2018, 24 (03) : 298 - 308
  • [25] 3D human action analysis and recognition through GLAC descriptor on 2D motion and static posture images
    Bulbul, Mohammad Farhad
    Islam, Saiful
    Ali, Hazrat
    MULTIMEDIA TOOLS AND APPLICATIONS, 2019, 78 (15) : 21085 - 21111
  • [26] Towards Improved Human Action Recognition Using Convolutional Neural Networks and Multimodal Fusion of Depth and Inertial Sensor Data
    Ahmad, Zeeshan
    Khan, Naimul Mefraz
    2018 IEEE INTERNATIONAL SYMPOSIUM ON MULTIMEDIA (ISM 2018), 2018, : 223 - 230
  • [27] Enhanced Depth Motion Maps for Improved Human Action Recognition from Depth Action Sequences
    Rao, Dustakar Surendra
    Rao, L. Koteswara
    Bhagyaraju, Vipparthi
    Meng, Goh Kam
    TRAITEMENT DU SIGNAL, 2024, 41 (03) : 1461 - 1472
  • [28] Human action recognition using Lie Group features and convolutional neural networks
    Cai, Linqin
    Liu, Chengpeng
    Yuan, Rongdi
    Ding, Heen
    NONLINEAR DYNAMICS, 2020, 99 (04) : 3253 - 3263
  • [29] Stratified pooling based deep convolutional neural networks for human action recognition
    Yu, Sheng
    Cheng, Yun
    Su, Songzhi
    Cai, Guorong
    Li, Shaozi
    MULTIMEDIA TOOLS AND APPLICATIONS, 2017, 76 (11) : 13367 - 13382
  • [30] HUMAN ACTIVITY DETECTION AND ACTION RECOGNITION IN VIDEOS USING CONVOLUTIONAL NEURAL NETWORKS
    Basavaiah, Jagadeesh
    Patil, Chandrashekar Mohan
    JOURNAL OF INFORMATION AND COMMUNICATION TECHNOLOGY-MALAYSIA, 2020, 19 (02) : 157 - 183