Deep Residual Network with a CBAM Mechanism for the Recognition of Symmetric and Asymmetric Human Activity Using Wearable Sensors

Times Cited: 7
Authors
Mekruksavanich, Sakorn [1 ]
Jitpattanakul, Anuchit [2 ,3 ]
Affiliations
[1] Univ Phayao, Sch Informat & Commun Technol, Dept Comp Engn, Phayao 56000, Thailand
[2] King Mongkuts Univ Technol North Bangkok, Fac Appl Sci, Dept Math, Bangkok 10800, Thailand
[3] King Mongkuts Univ Technol North Bangkok, Intelligent & Nonlinear Dynam Innovat Res Ctr, Sci & Technol Res Inst, Bangkok 10800, Thailand
Source
SYMMETRY-BASEL | 2024, Vol. 16, Issue 05
Keywords
human activity recognition; wearable sensor; symmetric human activity; deep learning; deep residual network; INTERNET;
DOI
10.3390/sym16050554
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline Classification Codes
07; 0710; 09
Abstract
Wearable devices are paramount in health-monitoring applications because they provide contextual information for identifying and recognizing human activities. Although sensor-based human activity recognition (HAR) has been studied extensively, prior work has not clearly differentiated between symmetric and asymmetric motions, and identifying these movement patterns could deepen the assessment of physical activity. The main objective of this research is to investigate the use of wearable motion sensors and deep convolutional neural networks for analyzing symmetric and asymmetric activities. The study proposes a new approach for classifying symmetric and asymmetric motions using a deep residual network that incorporates channel and spatial convolutional block attention modules (CBAMs). Two publicly available benchmark HAR datasets, consisting of inertial measurements from wrist-worn sensors, are used to assess the model's efficacy. The proposed model is evaluated thoroughly and achieves high accuracy on both datasets, and an ablation study shows that both the residual mappings and the CBAMs contribute substantially to performance. The improved accuracy and F1-score, especially for asymmetric activities, underline the importance of recognizing basic movement symmetries in sensor-based activity recognition with wearable devices. The proposed technique can deliver more accurate and detailed activity monitoring, offering prospective benefits in domains such as personalized healthcare, fitness tracking, and rehabilitation progress evaluation.
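The abstract describes residual convolutional blocks augmented with channel and spatial CBAM attention applied to wrist-worn inertial time series. The following PyTorch sketch is illustrative only: the paper's exact layer widths, kernel sizes, reduction ratio, and block count are not given here, so all module names and hyperparameters below are assumptions rather than the authors' reported architecture.

```python
# Minimal sketch of a CBAM-augmented 1D residual block for inertial sensor
# windows of shape (batch, channels, time). Hyperparameters are illustrative.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):                       # x: (batch, channels, time)
        avg = self.mlp(x.mean(dim=2))           # average-pooled descriptor
        mx = self.mlp(x.amax(dim=2))            # max-pooled descriptor
        scale = torch.sigmoid(avg + mx).unsqueeze(2)
        return x * scale                        # reweight channels


class SpatialAttention(nn.Module):
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv1d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)       # channel-wise average map
        mx = x.amax(dim=1, keepdim=True)        # channel-wise max map
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale                        # reweight time steps


class ResidualCBAMBlock(nn.Module):
    def __init__(self, channels: int, kernel_size: int = 5):
        super().__init__()
        pad = kernel_size // 2
        self.body = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.BatchNorm1d(channels),
            nn.ReLU(inplace=True),
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.BatchNorm1d(channels),
        )
        # CBAM: channel attention followed by spatial attention
        self.cbam = nn.Sequential(ChannelAttention(channels), SpatialAttention())
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.cbam(self.body(x))
        return self.act(out + x)                # identity residual mapping
```

In a full model, a stack of such blocks would typically be followed by global pooling and a softmax classifier over the activity labels; the wrist-worn sensor window shape and the specific classifier head are likewise assumed for this sketch.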
Pages: 26