CapsGaNet: Deep Neural Network Based on Capsule and GRU for Human Activity Recognition

Cited by: 26
Authors
Sun, Xiaojie [1 ]
Xu, Hongji [1 ]
Dong, Zheng [1 ]
Shi, Leixin [1 ]
Liu, Qiang [1 ]
Li, Juan [1 ]
Li, Tiankuo [1 ]
Fan, Shidi [1 ]
Wang, Yuhao [1 ]
Affiliations
[1] Shandong Univ, Sch Informat Sci & Engn, Qingdao 266237, Peoples R China
Source
IEEE SYSTEMS JOURNAL | 2022, Vol. 16, No. 4
Keywords
Feature extraction; Deep learning; Convolutional neural networks; Activity recognition; Convolution; Sensors; Kernel; Aggressive activity; deep learning; human activity recognition (HAR); spatiotemporal feature; WEARABLE SENSOR;
DOI
10.1109/JSYST.2022.3153503
CLC Classification
TP [Automation technology; computer technology]
Subject Classification
0812
Abstract
Advances in deep learning, with its ability to automatically extract high-level features, have opened bright prospects for human activity recognition (HAR). However, traditional HAR methods still suffer from incomplete feature extraction, which may lead to incorrect recognition results. To resolve this problem, a novel framework for spatiotemporal multi-feature extraction in HAR, called CapsGaNet, is proposed; it is based on capsules and gated recurrent units (GRU) with attention mechanisms. The framework comprises a spatial feature extraction layer consisting of capsule blocks, a temporal feature extraction layer consisting of GRU with attention mechanisms, and an output layer. In addition, considering the practical need to recognize aggressive activities in specific scenarios such as smart prisons, we constructed a daily and aggressive activity dataset (DAAD). Moreover, based on the acceleration characteristics of aggressive activity, a threshold-based approach for aggressive activity detection is proposed to meet the requirements of real-time operation and low computational complexity in prison scenarios. Experiments on the wireless sensor data mining (WISDM) dataset and the DAAD dataset verify that the proposed CapsGaNet effectively improves recognition accuracy, and that the threshold-based detection approach provides an effective way to perform HAR with smart sensor devices in smart prison scenarios.
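The abstract does not spell out the threshold rule; a minimal sketch of one plausible realization, assuming a sliding window of triaxial accelerometer samples in units of g and an illustrative 2.5 g threshold (the window shape, gravity handling, and threshold value are assumptions, not details taken from the paper):

```python
import numpy as np

def is_aggressive(window, threshold_g=2.5):
    """Flag a window of triaxial accelerometer samples (shape (N, 3), in g)
    as aggressive when the peak dynamic acceleration -- the magnitude with
    the 1 g static gravity component subtracted -- exceeds the threshold.
    The 2.5 g default is illustrative only, not the paper's value."""
    magnitude = np.linalg.norm(window, axis=1)  # per-sample |a|
    dynamic = np.abs(magnitude - 1.0)           # remove static gravity
    return bool(dynamic.max() > threshold_g)

# Example: a calm window resting at ~1 g vs. one with a punch-like spike.
calm = np.tile([0.0, 0.0, 1.0], (50, 1))
punch = calm.copy()
punch[25] = [3.0, 0.0, 4.0]                     # |a| = 5 g at the spike
```

A gate of this kind can run cheaply on-device before any neural model is invoked, which matches the real-time, low-complexity motivation stated in the abstract.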
Pages: 5845-5855 (11 pages)