A Comparison of Wearable Sensor Configuration Methods for Human Activity Recognition Using CNN

Times Cited: 0
Authors
Tong, Lina [1 ]
Lin, Qianzhi [1 ]
Qin, Chuanlei [1 ]
Peng, Liang [2 ]
Affiliations
[1] China Univ Min & Technol Beijing, Sch Mech Elect & Informat Engn, Beijing, Peoples R China
[2] Chinese Acad Sci, Inst Automat, State Key Lab Management & Control Complex Syst, Beijing, Peoples R China
Source
PROCEEDINGS OF THE 2021 IEEE INTERNATIONAL CONFERENCE ON PROGRESS IN INFORMATICS AND COMPUTING (PIC) | 2021
Keywords
convolutional neural network (CNN); human activity recognition (HAR); sensor configuration methods; wearable sensors;
DOI
10.1109/PIC53636.2021.9687056
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Configuration methods for wearable sensors in human activity recognition (HAR), in terms of both sensor number and placement, are analytically discussed. Based on the publicly available Daily and Sports Activities data set, a convolutional neural network (CNN) was built to recognize 19 kinds of daily and sports activities, and the model was then optimized for better performance. The results of numerous comparative experiments show that deep learning-based HAR outperforms machine learning-based HAR in accuracy, and that the improvement in accuracy is not directly related to an increase in the number of sensors. Because of its strong feature-extraction capability, deep learning captures not only activity-related features but also individual differences; therefore, locations with less individual randomness should be selected according to the practical engineering context. Moreover, the results are also influenced by the limb symmetry present in the data set. Finally, the feasibility of achieving higher accuracy with fewer sensors is demonstrated.
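The record does not specify the CNN architecture used in the paper. As a minimal sketch only (assuming PyTorch and the standard layout of the Daily and Sports Activities data set: 19 activities, 5 IMU units x 9 channels = 45 channels, 5 s windows at 25 Hz = 125 samples), the following 1D CNN illustrates how comparing sensor configurations reduces to selecting channel subsets before training; it is not the authors' model.

```python
# Illustrative 1D CNN for wearable-sensor HAR (not the paper's architecture).
import torch
import torch.nn as nn

class HarCnn(nn.Module):
    def __init__(self, in_channels: int = 45, n_classes: int = 19):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.BatchNorm1d(128),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time), e.g. (N, 45, 125) for all 5 units
        return self.classifier(self.features(x).squeeze(-1))

# A single-unit configuration (e.g. torso only, 9 channels) is evaluated by
# feeding only that unit's channels and retraining the same network:
model = HarCnn(in_channels=9)
dummy = torch.randn(8, 9, 125)          # 8 windows from one sensor unit
print(model(dummy).shape)               # torch.Size([8, 19])
```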
Pages: 288-292
Page count: 5