RGB-D based human action recognition using evolutionary self-adaptive extreme learning machine with knowledge-based control parameters

Cited by: 0
Authors
Preksha Pareek
Ankit Thakkar
Affiliations
[1] Department of Computer Science and Engineering, Institute of Technology, Nirma University
Source
Journal of Ambient Intelligence and Humanized Computing | 2023 / Volume 14
Keywords
Human action recognition; Extreme learning machine; Principal component analysis; Self-adaptive differential evolution with knowledge-based control parameters; Depth-based action classification;
DOI: not available
Abstract
Human Action Recognition (HAR) has gained considerable attention due to its many applications, such as activity monitoring, robotics, and visual surveillance. An action recognition task consists of feature extraction, dimensionality reduction, and action classification. This paper proposes an action recognition approach for depth-based input by designing a Single-Layer Feedforward Network (SLFN) using Self-adaptive Differential Evolution with knowledge-based control parameters-Extreme Learning Machine (SKPDE-ELM). To capture motion cues, Depth Motion Maps (DMMs) are used, and Local Binary Patterns (LBP) are applied to them to obtain compact features. Principal Component Analysis (PCA) is then applied to reduce the feature dimensionality. For the action classification task, the Extreme Learning Machine (ELM) achieves good performance on depth-based input owing to its fast learning speed and good generalization. To further optimize the ELM classifier, an evolutionary method named SKPDE is used to derive its hidden-layer parameters. The performance of the proposed approach is compared with that of existing approaches, namely Kernel ELM (KELM), L2-Collaborative Representation Classifier (CRC), and Probabilistic CRC (Pro-CRC), on the MSRAction3D (557 samples), MSRAction3D (567 samples), MSRDailyActivity3D, MSRGesture3D, and UTD-MHAD datasets. The proposed approach is also statistically validated using the Wilcoxon signed-rank test.
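As a rough illustration of the classification stage only, the sketch below feeds PCA-reduced features into an ELM whose hidden-layer weights and biases are tuned by a plain differential evolution (DE/rand/1/bin) loop. It uses random toy data in place of the LBP-of-DMM features, and fixed control parameters F and CR where SKPDE would self-adapt them; all dimensions, population settings, and variable names are illustrative assumptions, not the authors' implementation.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

def elm_fit_predict(W, b, X_tr, Y_tr, X_te):
    """Train the ELM output layer by least squares and return test class scores."""
    H_tr = np.tanh(X_tr @ W + b)          # hidden-layer activations (train)
    beta = np.linalg.pinv(H_tr) @ Y_tr    # output weights via Moore-Penrose pseudo-inverse
    H_te = np.tanh(X_te @ W + b)          # hidden-layer activations (test)
    return H_te @ beta

def accuracy(W, b, X_tr, Y_tr, X_va, y_va):
    """Validation accuracy, used as the DE fitness function."""
    scores = elm_fit_predict(W, b, X_tr, Y_tr, X_va)
    return float(np.mean(scores.argmax(axis=1) == y_va))

# Toy data standing in for LBP-of-DMM features (shapes are assumptions).
n_classes = 10
X = rng.normal(size=(300, 500))
y = rng.integers(0, n_classes, size=300)
X_tr, X_va, y_tr, y_va = X[:200], X[200:], y[:200], y[200:]

# PCA for dimensionality reduction, as in the paper's pipeline.
pca = PCA(n_components=50).fit(X_tr)
X_tr_p, X_va_p = pca.transform(X_tr), pca.transform(X_va)
Y_tr = np.eye(n_classes)[y_tr]            # one-hot targets for the ELM output layer

n_hidden, d = 100, X_tr_p.shape[1]
dim = d * n_hidden + n_hidden             # flattened hidden weights + biases per candidate

def unpack(vec):
    """Split a flat candidate vector into hidden weights W and biases b."""
    return vec[:d * n_hidden].reshape(d, n_hidden), vec[d * n_hidden:]

# Plain DE/rand/1/bin loop; SKPDE would additionally self-adapt F and CR.
pop_size, F, CR, generations = 20, 0.5, 0.9, 30
pop = rng.uniform(-1.0, 1.0, size=(pop_size, dim))
fit = np.array([accuracy(*unpack(p), X_tr_p, Y_tr, X_va_p, y_va) for p in pop])

for _ in range(generations):
    for i in range(pop_size):
        idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
        r1, r2, r3 = pop[idx]
        mutant = r1 + F * (r2 - r3)                      # mutation
        cross = rng.random(dim) < CR
        trial = np.where(cross, mutant, pop[i])          # binomial crossover
        f_trial = accuracy(*unpack(trial), X_tr_p, Y_tr, X_va_p, y_va)
        if f_trial >= fit[i]:                            # greedy selection
            pop[i], fit[i] = trial, f_trial

W_best, b_best = unpack(pop[fit.argmax()])
print(f"Best validation accuracy with DE-tuned ELM: {fit.max():.3f}")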
Pages: 939–957 (18 pages)