An Efficient Hierarchical Multiscale and Multidimensional Feature Adaptive Fusion Network for Human Activity Recognition Using Wearable Sensors

Times Cited: 1
Authors
Li, Xinya [1]
Xu, Hongji [1]
Wang, Yang [1]
Zeng, Jiaqi [1]
Li, Yiran [1]
Li, Xiaoman [1]
Ai, Wentao [1]
Zheng, Hao [1]
Duan, Yupeng [1]
Affiliations
[1] Shandong Univ, Sch Informat Sci & Engn, Qingdao 266237, Peoples R China
Keywords
Feature extraction; Human activity recognition; Internet of Things; Time-frequency analysis; Floors; Data mining; Convolution; Time series analysis; Long short term memory; Kernel; Hierarchical multiscale and multidimensional feature extraction; human activity recognition (HAR); sensor data; time-frequency domain feature
DOI
10.1109/JIOT.2024.3491362
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
As Internet of Things (IoT) technology advances, human activity recognition (HAR) using IoT devices, including wearable sensors, has become prevalent in various applications. Nevertheless, many sensor-based HAR methods still struggle to balance recognition accuracy with network complexity, and most existing sensor-based HAR networks fail to fuse multidimensional features effectively. To address these issues, a hierarchical multiscale time-frequency and channel feature adaptive fusion (HMTF-CFAF) network is proposed. HMTF-CFAF uses hierarchical connectivity to efficiently extract distinctive multiscale time-frequency and channel features from sensor data. Furthermore, it incorporates a feature fusion mechanism that integrates and exchanges multiscale and multidimensional features, yielding more comprehensive and richer representations. The HMTF-CFAF network is evaluated on three datasets: 1) the University of California Irvine HAR (UCI-HAR) dataset; 2) the physical activity monitoring for aging people (PAMAP2) dataset; and 3) a self-collected household behavior (HB) dataset. It achieves accuracies of 97.66%, 98.75%, and 98.80% on these three datasets, respectively, demonstrating its excellent performance.
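
This record does not include implementation details, but the abstract names two concrete techniques: hierarchical multiscale feature extraction via hierarchical connectivity, and adaptive fusion of multidimensional (time-frequency and channel) features. The PyTorch sketch below illustrates one plausible reading, combining Res2Net-style hierarchical connections between scales with an SE-style channel gate for adaptive fusion; every class name, width, and kernel size here is an illustrative assumption, not the authors' HMTF-CFAF design.

# Hypothetical sketch of hierarchical multiscale feature extraction with
# adaptive channel fusion, as suggested by the abstract. Not the paper's code.
import torch
import torch.nn as nn

class HierarchicalMultiscaleBlock(nn.Module):
    def __init__(self, channels: int, scales: int = 4):
        super().__init__()
        assert channels % scales == 0
        self.scales = scales
        width = channels // scales
        # One temporal conv per scale; each scale also receives the previous
        # scale's output, which provides the hierarchical connectivity.
        self.convs = nn.ModuleList(
            nn.Conv1d(width, width, kernel_size=3, padding=1)
            for _ in range(scales - 1)
        )
        # SE-style gate that adaptively re-weights channels after fusion.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),
            nn.Conv1d(channels, channels // 4, kernel_size=1),
            nn.ReLU(),
            nn.Conv1d(channels // 4, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -- a window of wearable-sensor readings.
        splits = torch.chunk(x, self.scales, dim=1)
        outputs, prev = [splits[0]], splits[0]
        for conv, split in zip(self.convs, splits[1:]):
            prev = conv(split + prev)      # feed the previous scale forward
            outputs.append(prev)
        fused = torch.cat(outputs, dim=1)  # concatenated multiscale features
        return fused * self.gate(fused)    # adaptive channel re-weighting

# Example: a batch of 8 windows, 64 feature channels, 128 time steps.
y = HierarchicalMultiscaleBlock(channels=64)(torch.randn(8, 64, 128))
print(y.shape)  # torch.Size([8, 64, 128])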
Pages: 6492-6505
Page count: 14