HAR-DeepConvLG: Hybrid deep learning-based model for human activity recognition in IoT applications

Times Cited: 20
Authors
Ding, Weiping [1 ,2 ]
Abdel-Basset, Mohamed [3 ]
Mohamed, Reda [3 ]
Affiliations
[1] Nantong Univ, Sch Informat Sci & Technol, Nantong 226019, Peoples R China
[2] City Univ Macau, Fac Data Sci, Macau, Peoples R China
[3] Zagazig Univ, Fac Comp & Informat, Zagazig 44519, Egypt
Funding
National Natural Science Foundation of China;
Keywords
Internet of Things; Long short-term memory; Convolution; Human activity recognition; Gated recurrent unit; NETWORK; CLASSIFICATION; ARCHITECTURE;
DOI
10.1016/j.ins.2023.119394
Chinese Library Classification (CLC)
TP [Automation and computer technology];
Discipline Code
0812;
Abstract
Smartphones and wearable devices contain built-in sensors that collect multivariate time-series data, which can be used to recognize human activities. Research on human activity recognition (HAR) has gained significant attention in recent years due to its growing demand across application domains. As wearable sensor-aided devices and the Internet of Things (IoT) have become more common, HAR has drawn great attention in ubiquitous and mobile computing. To infer human activities from the massive amount of multivariate data generated by different wearable devices, this study proposes an innovative deep learning-based model named HAR-DeepConvLG. It comprises three convolution layers and a squeeze-and-excitation (SE) block, which precisely learn and extract spatial representations from the collected raw sensor data. The extracted features feed three parallel paths, each consisting of a long short-term memory (LSTM) layer connected in sequence with a gated recurrent unit (GRU) layer to learn temporal representations; the three paths are connected in parallel to mitigate the vanishing gradient problem. Finally, to evaluate the effectiveness of the proposed model, experiments were conducted on four widely used HAR datasets, and the model's performance was compared with several state-of-the-art deep learning models. The experimental results show that the proposed HAR-DeepConvLG model outperforms existing HAR deep learning-based models, achieving competitive classification accuracy.
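The squeeze-and-excitation (SE) mechanism named in the abstract can be sketched in plain NumPy: channel descriptors are "squeezed" by global average pooling over time, passed through a small two-layer gating network, and used to rescale each channel of the feature map. The channel count, reduction ratio, and random weights below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def se_block(x, w1, w2):
    """Channel-wise SE gating for a 1D feature map x of shape (C, T).

    w1 has shape (C // r, C) and w2 has shape (C, C // r), where r is
    the reduction ratio (an assumption here, not the paper's setting).
    """
    z = x.mean(axis=1)                    # squeeze: global average pool -> (C,)
    h = np.maximum(0.0, w1 @ z)           # excitation FC1 + ReLU -> (C // r,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ h)))   # excitation FC2 + sigmoid -> (C,) in (0, 1)
    return x * s[:, None]                 # rescale each channel by its gate

# Illustrative sizes: 8 channels, 16 time steps, reduction ratio 4.
rng = np.random.default_rng(0)
C, T, r = 8, 16, 4
x = rng.standard_normal((C, T))
w1 = rng.standard_normal((C // r, C)) * 0.1
w2 = rng.standard_normal((C, C // r)) * 0.1
y = se_block(x, w1, w2)
```

Because the gates lie in (0, 1), the block can only attenuate channels, letting the network emphasize informative sensor channels before the parallel LSTM-GRU paths model temporal structure.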
Pages: 22