Sensors-Based Human Activity Recognition Using Hybrid Features and Deep Capsule Network

Cited by: 0
Authors
Ghafoor, Hafiz Yasir [1 ]
Jahangir, Rashid [2 ]
Jaffar, Arfan [1 ]
Alroobaea, Roobaea [3 ]
Saidani, Oumaima [4 ]
Alhayan, Fatimah [4 ]
Affiliations
[1] Superior Univ, Fac Comp Sci & Informat Technol, Lahore 54600, Pakistan
[2] COMSATS Univ Islamabad, Dept Comp Sci, Vehari Campus, Vehari 61100, Pakistan
[3] Taif Univ, Coll Comp & Informat Technol, Dept Comp Sci, Taif 21944, Saudi Arabia
[4] Princess Nourah Bint Abdulrahman Univ, Coll Comp & Informat Sci, Dept Informat Syst, POB 84428, Riyadh 11671, Saudi Arabia
Keywords
Human activity recognition; Sensors; Long short term memory; Data models; Computational modeling; Feature extraction; Context modeling; Deep capsule network (DeepCapsNet); human activity recognition (HAR); Mel frequency cepstral coefficients (MFCCs); multifeatures; wearable sensors;
DOI
10.1109/JSEN.2024.3402314
CLC classification
TM [Electrical engineering]; TN [Electronic and communication technology];
Discipline codes
0808; 0809;
Abstract
With continuous advancements in artificial intelligence, human activity recognition (HAR) technologies have garnered widespread attention and found applications across diverse domains. Recently, various features and deep learning (DL) models have been proposed for HAR using sensor data. Although existing models and features achieve notable performance, their recognition accuracy still needs to be improved and their computational cost reduced. This research introduces a novel integration of multiple features with a deep capsule network (DeepCapsNet) capable of real-time signal processing. To construct the HAR model, information about the shape of patterns, the symmetry of the sensor data distribution, and variations in frequency modulation was extracted. This information was fed to the DL model (DeepCapsNet), which combines several convolutional layers (CLs) with a capsule network. The CLs in DeepCapsNet process temporal sequences and deliver scalar outputs, while the capsule network captures equivariance, which enhances the performance of the HAR model. Finally, the efficiency of DeepCapsNet is comprehensively assessed against other baseline models on three benchmark HAR datasets. The average accuracies of DeepCapsNet on the UCI HAR, WISDM, and PAMAP2 datasets are 97.6%, 98.5%, and 99.9%, respectively. The findings demonstrate that DeepCapsNet outperforms the baseline models in both accuracy and computational cost. Feature selection and model optimization need to be explored further to enhance the performance of the HAR model.
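The abstract mentions features describing the shape and symmetry of the sensor data distribution. As an illustrative sketch only (not the authors' exact feature set, which is not detailed in this record), such per-window statistics can be computed as below; `window_features` is a hypothetical helper name:

```python
import math

def window_features(window):
    """Extract simple shape/symmetry statistics from one sensor window.

    Returns (mean, std, skewness): mean and standard deviation describe
    the signal level and spread, while skewness quantifies the asymmetry
    of the sample distribution (0.0 for a perfectly symmetric window).
    """
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    std = math.sqrt(var)
    if std == 0:
        return (mean, 0.0, 0.0)
    skew = sum(((x - mean) / std) ** 3 for x in window) / n
    return (mean, std, skew)

# A symmetric window has zero skewness; a right-tailed one is positive.
symmetric = [-2.0, -1.0, 0.0, 1.0, 2.0]
skewed = [0.0, 0.0, 0.0, 0.0, 10.0]
print(window_features(symmetric))  # skewness = 0.0
print(window_features(skewed))     # positive skewness
```

In a full HAR pipeline, vectors like these (together with frequency-domain features such as MFCCs) would be stacked per window and passed to the classifier.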
Pages: 23129 - 23139
Page count: 11
Related papers
50 records total
  • [21] Deep Human Activity Recognition Using Wearable Sensors
    Lawal, Isah A.
    Bano, Sophia
    12TH ACM INTERNATIONAL CONFERENCE ON PERVASIVE TECHNOLOGIES RELATED TO ASSISTIVE ENVIRONMENTS (PETRA 2019), 2019, : 45 - 48
  • [22] Human Activity Recognition with Inertial Sensors using a Deep Learning Approach
    Zebin, Tahmina
    Scully, Patricia J.
    Ozanyan, Krikor B.
    2016 IEEE SENSORS, 2016,
  • [23] A Sensors-Based River Water Quality Assessment System Using Deep Neural Network
    Chopade, Swati
    Gupta, Hari Prabhat
    Mishra, Rahul
    Oswal, Aman
    Kumari, Preti
    Dutta, Tanima
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (16): : 14375 - 14384
  • [24] HDL: Hierarchical Deep Learning Model based Human Activity Recognition using Smartphone Sensors
    Su, Tongtong
    Sun, Huazhi
    Ma, Chunmei
    Jiang, Lifen
    Xu, Tongtong
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [25] An Efficient Hierarchical Multiscale and Multidimensional Feature Adaptive Fusion Network for Human Activity Recognition Using Wearable Sensors
    Li, Xinya
    Xu, Hongji
    Wang, Yang
    Zeng, Jiaqi
    Li, Yiran
    Li, Xiaoman
    Ai, Wentao
    Zheng, Hao
    Duan, Yupeng
    IEEE INTERNET OF THINGS JOURNAL, 2025, 12 (06): : 6492 - 6505
  • [26] A Multiscale Cross-Modal Interactive Fusion Network for Human Activity Recognition Using Wearable Sensors and Smartphones
    Yang, Xin
    Xu, Zeju
    Liu, Haodong
    Shull, Peter B.
    Redmond, Stephen
    Liu, Guanzheng
    Wang, Changhong
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (16): : 27139 - 27152
  • [27] Quantitative Analysis of Mother Wavelet Function Selection for Wearable Sensors-Based Human Activity Recognition
    Nematallah, Heba
    Rajan, Sreeraman
    SENSORS, 2024, 24 (07)
  • [28] Wearable Sensors-Based Hand Gesture Recognition for Human-Robot Collaboration in Construction
    Wang, Xin
    Veeramani, Dharmaraj
    Zhu, Zhenhua
    IEEE SENSORS JOURNAL, 2023, 23 (01) : 495 - 505
  • [29] CHARM-Deep: Continuous Human Activity Recognition Model Based on Deep Neural Network Using IMU Sensors of Smartwatch
    Ashry, Sara
    Ogawa, Tetsuji
    Gomaa, Walid
    IEEE SENSORS JOURNAL, 2020, 20 (15) : 8757 - 8770
  • [30] Deep Human Activity Recognition With Localisation of Wearable Sensors
    Lawal, Isah A.
    Bano, Sophia
    IEEE ACCESS, 2020, 8 : 155060 - 155070