Human Activity Recognition Using K-Nearest Neighbor Machine Learning Algorithm

Cited: 33
Authors
Mohsen, Saeed [1 ]
Elkaseer, Ahmed [2 ]
Scholz, Steffen G. [2 ,3 ,4 ]
Affiliations
[1] Al Madina Higher Inst Engn & Technol, Elect & Commun Engn Dept, Giza, Egypt
[2] Karlsruhe Inst Technol, Inst Automat & Appl Informat, D-76344 Karlsruhe, Germany
[3] Karlsruhe Nano Micro Facil KNMF, Eggenstein Leopoldshafen, Germany
[4] Swansea Univ, Coll Engn, Future Mfg Res Inst, Swansea SA1 8EN, W Glam, Wales
Source
SUSTAINABLE DESIGN AND MANUFACTURING, KES-SDM 2021 | 2022, Vol. 262
Keywords
Machine learning; KNN; Human activity recognition; Industry 4.0
DOI
10.1007/978-981-16-6128-0_29
Chinese Library Classification (CLC)
X [Environmental science, safety science]
Discipline classification codes
08; 0830
Abstract
The smart factory in the era of Industry 4.0 requires humans to communicate continuously with each other and with the existing smart assets in order to integrate their activities into a cyber-physical system (CPS) within the smart factory. Machine learning (ML) algorithms can help precisely recognize human activities, provided that well-designed ML algorithms trained for high-performance recognition are developed. This paper presents a k-nearest neighbor (KNN) algorithm for the classification of human activities, namely Laying, Downstairs walking, Sitting, Upstairs walking, Standing, and Walking. The algorithm is trained and its parameters are precisely tuned for high accuracy. Experimentally, a normalized confusion matrix, a classification report of human activities, receiver operating characteristic (ROC) curves, and precision-recall curves are used to analyze the performance of the KNN algorithm. The results show that the KNN algorithm provides high performance in the classification of human activities. The weighted average precision, recall, F1-score, and area under the micro-average precision-recall curve for the KNN are 90.96%, 90.46%, 90.37%, and 96.5%, respectively, while the area under the ROC curve is 100%.
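The evaluation pipeline described in the abstract (a KNN classifier for six activity classes, scored with a confusion matrix and weighted-average precision, recall, and F1) can be sketched in scikit-learn. This is a hedged illustration, not the authors' code: the synthetic data below stands in for the smartphone sensor feature vectors, and the choice of `n_neighbors=5` is an arbitrary placeholder for the tuned hyperparameter the paper reports.

```python
# Sketch of a KNN human-activity classifier with the metrics named in the
# abstract. Synthetic data is used here; the paper's experiments use real
# sensor features for six activities.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import (confusion_matrix, precision_score,
                             recall_score, f1_score)

ACTIVITIES = ["Laying", "Downstairs walking", "Sitting",
              "Upstairs walking", "Standing", "Walking"]

# Synthetic stand-in for the accelerometer/gyroscope feature vectors
X, y = make_classification(n_samples=1200, n_features=20, n_informative=15,
                           n_classes=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# n_neighbors is the tunable hyperparameter; 5 is a placeholder, not the
# value tuned in the paper
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
y_pred = knn.predict(X_test)

cm = confusion_matrix(y_test, y_pred)  # 6x6 matrix, one row per true class
prec = precision_score(y_test, y_pred, average="weighted")
rec = recall_score(y_test, y_pred, average="weighted")
f1 = f1_score(y_test, y_pred, average="weighted")
print(cm.shape, round(prec, 3), round(rec, 3), round(f1, 3))
```

The `average="weighted"` argument matches the abstract's "weighted average" metrics: each per-class score is weighted by that class's support before averaging.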
Pages: 304-313
Page count: 10