Comparison of Different Sets of Features for Human Activity Recognition by Wearable Sensors

Cited: 50
Authors
Rosati, Samanta [1 ]
Balestra, Gabriella [1 ]
Knaflitz, Marco [1 ]
Affiliations
[1] Politecn Torino, Dept Elect & Telecommun, I-10129 Turin, Italy
Keywords
human activity recognition; wearable sensors; MIMU; genetic algorithm; feature selection; classifier optimization; machine learning; DATA FUSION; SYSTEM;
DOI
10.3390/s18124189
CLC Number
O65 [Analytical Chemistry];
Discipline Codes
070302; 081704;
Abstract
Human Activity Recognition (HAR) is an emerging area of interest for medical, military, and security applications. However, the identification of the features to be used for activity classification and recognition is still an open question. The aim of this study was to compare two different feature sets for HAR. Specifically, we compared a set including time, frequency, and time-frequency domain features widely used in the literature (FeatSet_A) with a set of time-domain features derived by considering the physical meaning of the acquired signals (FeatSet_B). The comparison of the two sets was based on the performance obtained using four machine learning classifiers. Sixty-one healthy subjects were asked to perform seven different daily activities wearing a MIMU-based device. Each signal was segmented using a 5-s window, and for each window 222 and 221 variables were extracted for FeatSet_A and FeatSet_B, respectively. Each set was reduced using a Genetic Algorithm (GA) that simultaneously performs feature selection and classifier optimization. Our results showed that the Support Vector Machine achieved the highest performance with both sets (97.1% and 96.7% for FeatSet_A and FeatSet_B, respectively). However, FeatSet_B allows a better understanding of alterations in biomechanical behavior in more complex situations, such as when applied to pathological subjects.
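The 5-s windowing and time-domain feature extraction described in the abstract can be approximated with the sketch below. This is an illustrative reconstruction, not the authors' code: the function names, the non-overlapping windowing, and the specific descriptors (mean, standard deviation, RMS, range) are assumptions chosen to show the general pipeline.

```python
import numpy as np

def segment_windows(signal, fs, win_s=5.0):
    """Split a 1-D signal into non-overlapping windows of win_s seconds."""
    n = int(fs * win_s)                      # samples per window
    n_windows = len(signal) // n             # drop the incomplete tail
    return signal[:n_windows * n].reshape(n_windows, n)

def time_domain_features(window):
    """A few common time-domain descriptors for one window
    (illustrative subset; the paper extracts hundreds of variables)."""
    return {
        "mean": float(np.mean(window)),
        "std": float(np.std(window)),
        "rms": float(np.sqrt(np.mean(window ** 2))),
        "range": float(np.ptp(window)),      # peak-to-peak amplitude
    }

# Example: 20 s of a 100 Hz accelerometer-like signal -> four 5-s windows
fs = 100
t = np.arange(0, 20, 1 / fs)
sig = np.sin(2 * np.pi * 1.0 * t)
windows = segment_windows(sig, fs)
features = [time_domain_features(w) for w in windows]
```

Each window then yields one feature vector, and the per-window vectors form the matrix that the GA-based feature selection operates on.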
Pages: 16
References
58 total
[11] [Anonymous], 2016, J MECH MED BIOL.
[12] Attal F, Mohammed S, Dedabrishvili M, Chamroukhi F, Oukhellou L, Amirat Y. Physical Human Activity Recognition Using Wearable Sensors [J]. SENSORS, 2015, 15(12): 31314-31338.
[13] Baldominos A, Saez Y, Isasi P. Evolutionary Design of Convolutional Neural Networks for Human Activity Recognition in Sensor-Rich Environments [J]. SENSORS, 2018, 18(04).
[14] Bao L, Intille SS. Activity recognition from user-annotated acceleration data [J]. PERVASIVE COMPUTING, PROCEEDINGS, 2004, 3001: 1-17.
[15] Cao J, Li W, Ma C, Tao Z. Optimizing multi-sensor deployment via ensemble pruning for wearable activity recognition [J]. INFORMATION FUSION, 2018, 41: 68-79.
[16] Chen B, Zheng E, Wang Q. A Locomotion Intent Prediction System Based on Multi-Sensor Fusion [J]. SENSORS, 2014, 14(07): 12349-12369.
[17] De Leonardis G, 2018, IEEE INT SYM MED MEA, P564.
[18] Doewes A. 2017 IEEE International Conference on Consumer Electronics - Taiwan (ICCE-TW), 2017, P171. DOI: 10.1109/ICCE-China.2017.7991050.
[19] Engelbrecht AP. Computational Intelligence: An Introduction, 2nd ed., 2007.
[20] Eyobu OS, Han DS. Feature Representation and Data Augmentation for Human Activity Classification Based on Wearable IMU Sensor Data Using a Deep LSTM Neural Network [J]. SENSORS, 2018, 18(09).