Angular Features-Based Human Action Recognition System for a Real Application With Subtle Unit Actions

Cited by: 3
Authors
Ryu, Jaeyeong [1]
Patil, Ashok Kumar [1]
Chakravarthi, Bharatesh [1]
Balasubramanyam, Adithya [1]
Park, Soungsill [1]
Chai, Youngho [1]
Affiliations
[1] Chung-Ang University, Graduate School of Advanced Imaging Science, Seoul 06974, South Korea
Keywords
Sensors; Benchmark testing; Skeleton; Feature extraction; Data mining; Training data; Sensor systems; Human action recognition; skeleton; motion capture; ELM classifier; surveillance; motion; representation; ensemble
DOI
10.1109/ACCESS.2022.3144456
CLC Number (Chinese Library Classification)
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Human action recognition (HAR) technology is receiving considerable attention in the field of human-computer interaction. We present a HAR system that works stably in real-world applications, where the system must identify detailed actions for specific purposes and where the action data contain many variations. Accordingly, we conducted three experiments. First, we tested our recognition system on the UTD-MHAD dataset and compared its accuracy with results from previous research, confirming that it achieves an average accuracy of 91%, competitive with existing recognition systems. Furthermore, we hypothesized the use of a HAR system to detect burglary. In the second experiment, we compared an existing benchmark dataset with our crime-detection dataset by classifying the test-scenario data with recognition systems trained on each dataset. The system trained on our dataset achieved higher accuracy than the one trained on the existing benchmark, showing that training data for a real application should contain detailed actions. In the third experiment, we sought the motion-data representation that recognizes actions stably regardless of data variation; in a real application, action data vary from person to person. We therefore introduced variation into the action data using a cross-subject protocol and a moving-area setting, trained the recognition system separately on position data and on angle data, and compared the accuracy of the two systems. We found that the angle format yields better accuracy because angle data convert variations of the same action into a consistent pattern.
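To make the angle-based idea concrete, the following is a minimal Python sketch, not the authors' implementation: the joint-chain list CHAINS, the mean/std temporal pooling in sequence_features, and the toy skeleton dimensions are all illustrative assumptions, while the ELM classifier follows the standard extreme-learning-machine formulation (random hidden layer, closed-form output weights) named in the keywords.

import numpy as np

# Hypothetical parent-joint-child chains on a toy skeleton; the paper's actual
# skeleton model and joint pairs are not specified here.
CHAINS = [(0, 1, 2), (1, 2, 3), (4, 5, 6), (5, 6, 7)]

def joint_angles(frame):
    # frame: (num_joints, 3) array of 3D joint positions for one frame.
    # Returns one angle (radians) per chain.
    angles = []
    for p, j, c in CHAINS:
        u = frame[p] - frame[j]                      # bone vector joint -> parent
        v = frame[c] - frame[j]                      # bone vector joint -> child
        cos = u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8)
        angles.append(np.arccos(np.clip(cos, -1.0, 1.0)))
    return np.array(angles)

def sequence_features(seq):
    # seq: (num_frames, num_joints, 3). Mean/std pooling over time is an
    # illustrative choice, not necessarily the paper's temporal encoding.
    a = np.stack([joint_angles(f) for f in seq])     # (num_frames, num_chains)
    return np.concatenate([a.mean(axis=0), a.std(axis=0)])

class ELM:
    # Minimal extreme learning machine: fixed random hidden layer, output
    # weights solved in closed form with the Moore-Penrose pseudo-inverse.
    def __init__(self, n_hidden=256, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)             # hidden activations
        T = np.eye(int(y.max()) + 1)[y]              # one-hot class targets
        self.beta = np.linalg.pinv(H) @ T            # least-squares solution
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return (H @ self.beta).argmax(axis=1)

# Toy usage with random "skeleton" clips: 8 clips, 30 frames, 8 joints.
rng = np.random.default_rng(1)
clips = rng.normal(size=(8, 30, 8, 3))
X = np.stack([sequence_features(s) for s in clips])
y = np.array([0, 0, 1, 1, 0, 1, 0, 1])
print(ELM(n_hidden=32).fit(X, y).predict(X))

Because each angle depends only on the relative directions of two bone vectors, it is unchanged under translation, whole-body rotation, and differences in limb length, which is one plausible reading of why the angle format converts action variation into a consistent pattern.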
Pages: 9645-9657
Number of pages: 13