Inception inspired CNN-GRU hybrid network for human activity recognition

Cited by: 87
Authors
Dua, Nidhi [1 ]
Singh, Shiva Nand [1 ]
Semwal, Vijay Bhaskar [2 ]
Challa, Sravan Kumar [1 ]
Affiliations
[1] Natl Inst Technol Jamshedpur, Dept ECE, Jamshedpur, Jharkhand, India
[2] Maulana Azad Natl Inst Technol, Dept CSE, Bhopal, MP, India
Keywords
Convolutional neural network; HAR; Inception; Gated recurrent unit; Wearable sensors; Human-computer interaction; Pattern recognition; Time series; System; Model
DOI
10.1007/s11042-021-11885-x
Chinese Library Classification
TP (Automation and Computer Technology)
Discipline Code
0812
Abstract
Human Activity Recognition (HAR) involves recognizing human activities from sensor data. Most HAR techniques rely on hand-crafted features and hence demand considerable human intervention. Moreover, the activity data obtained from sensors are highly imbalanced and therefore demand a robust classifier design. In this paper, a novel classifier, "ICGNet", is proposed for HAR; it is a hybrid of a Convolutional Neural Network (CNN) and a Gated Recurrent Unit (GRU). The CNN block used in the proposed network draws its inspiration from the well-known Inception module. It applies convolutional filters of multiple sizes simultaneously over the input and can thus capture information in the data at multiple scales. These multi-sized filters, introduced at the same level of the convolutional network, help compute more abstract features for local patches of data. The block also uses 1 x 1 convolutions to pool the input across the channel dimension, the intuition being that this helps the model extract valuable information hidden across channels. The proposed ICGNet leverages the strengths of both the CNN and the GRU and can therefore capture local features as well as long-term dependencies in multivariate time-series data. It is an end-to-end model for HAR that processes raw data captured from wearable sensors without any manual feature engineering. Integrated with adaptive user interfaces, the proposed HAR system can be applied to Human-Computer Interaction (HCI) fields such as interactive games, robot learning, health monitoring, and pattern-based surveillance. The overall accuracies achieved on two benchmark datasets, MHEALTH and PAMAP2, are 99.25% and 97.64%, respectively. The results indicate that the proposed network outperforms similar architectures proposed for HAR in the literature.
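The architecture the abstract describes can be sketched in PyTorch: parallel 1-D convolutions of several kernel sizes (Inception-style, including a 1 x 1 branch that pools across channels) whose concatenated feature maps feed a GRU. This is a minimal illustrative sketch, not the authors' exact ICGNet; the branch widths, GRU size, and the use of MHEALTH-like dimensions (23 sensor channels, 12 activity classes) are assumptions for demonstration only.

```python
import torch
import torch.nn as nn


class InceptionConvBlock(nn.Module):
    """Parallel 1-D convolutions of sizes 1, 3, and 5 over the same input,
    capturing features at multiple temporal scales (Inception-style)."""

    def __init__(self, in_ch: int, branch_ch: int):
        super().__init__()
        # The 1x1 branch pools the input across the channel dimension.
        self.b1 = nn.Conv1d(in_ch, branch_ch, kernel_size=1)
        self.b3 = nn.Conv1d(in_ch, branch_ch, kernel_size=3, padding=1)
        self.b5 = nn.Conv1d(in_ch, branch_ch, kernel_size=5, padding=2)
        self.act = nn.ReLU()

    def forward(self, x):
        # x: (batch, channels, time); padding keeps the time length equal
        # across branches so their outputs can be concatenated channel-wise.
        return self.act(torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1))


class HARNet(nn.Module):
    """CNN-GRU hybrid: multi-scale local features from the Inception-style
    block, long-term temporal dependencies from the GRU."""

    def __init__(self, n_channels: int, n_classes: int,
                 branch_ch: int = 16, gru_hidden: int = 32):
        super().__init__()
        self.inception = InceptionConvBlock(n_channels, branch_ch)
        self.gru = nn.GRU(3 * branch_ch, gru_hidden, batch_first=True)
        self.fc = nn.Linear(gru_hidden, n_classes)

    def forward(self, x):
        feats = self.inception(x)         # (batch, 3*branch_ch, time)
        seq = feats.permute(0, 2, 1)      # GRU expects (batch, time, features)
        _, h = self.gru(seq)              # h: (num_layers, batch, hidden)
        return self.fc(h[-1])             # class logits per window


# Example: a batch of 4 windows, 23 sensor channels, 128 time steps
# (MHEALTH-like dimensions, assumed here for illustration).
net = HARNet(n_channels=23, n_classes=12)
logits = net(torch.randn(4, 23, 128))    # shape: (4, 12)
```

In an end-to-end setup like the one described, raw sensor windows go in directly and class logits come out, with no hand-crafted features in between; training would use a standard cross-entropy loss over the activity labels.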
Pages: 5369-5403 (35 pages)