Multi-model weighted voting method based on convolutional neural network for human activity recognition

Times cited: 1
Authors
Ouyang, Kangyue [1 ]
Pan, Zhongliang [1 ]
Affiliations
[1] South China Normal Univ, Sch Elect & Informat Engn, Guangzhou 510006, Peoples R China
Keywords
Human activity recognition; Convolutional neural networks; Feature extraction; Two-dimensional graphs; Sensor data; Weighted voting; Ensemble
DOI
10.1007/s11042-023-17500-5
Chinese Library Classification (CLC) code
TP [Automation Technology, Computer Technology]
Subject classification code
0812
Abstract
In recent years, human activity recognition (HAR) has been widely used in medical rehabilitation, smart homes, and other fields. Recognition performance currently depends heavily on feature extraction and effective classification algorithms. On the one hand, traditional manual feature extraction and classification algorithms hinder further improvement of HAR. On the other hand, recent deep learning techniques can process data and extract features automatically, but they suffer from poor feature quality and information loss. To address these problems, this paper proposes a new recognition method that uses only wearable sensor data. In the feature extraction stage, each sensor axis is extracted separately as one-dimensional data, and the information from all axes is integrated into a two-dimensional graph. Two deep convolutional neural network models are then designed and trained on the one-dimensional data and the two-dimensional graph, respectively. Finally, a weighted voting method combines their outputs to obtain the classification result. Experiments show that the average recognition accuracy of the proposed method is about 3% higher than that of other deep neural network methods for HAR, which demonstrates its advantage in achieving better recognition results with limited data.
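The abstract describes a two-branch architecture: one CNN over per-axis one-dimensional signals, a second CNN over a two-dimensional graph that stacks all axes, and a weighted vote over the two sets of outputs. The following is a minimal sketch of that idea in PyTorch, not the authors' implementation; the number of axes, window length, number of classes, layer sizes, and the voting weights w1/w2 are illustrative assumptions.

```python
# Sketch (assumed configuration, not the paper's code) of a two-branch CNN with weighted voting.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_AXES = 9        # e.g., 3-axis accelerometer + gyroscope + magnetometer (assumed)
WINDOW_LEN = 128    # samples per sliding window (assumed)
NUM_CLASSES = 6     # number of activity classes (assumed)

class CNN1D(nn.Module):
    """Branch 1: 1D convolutions along time, treating each sensor axis as a channel."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(NUM_AXES, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.classifier = nn.Linear(64 * (WINDOW_LEN // 4), NUM_CLASSES)

    def forward(self, x):                       # x: (batch, NUM_AXES, WINDOW_LEN)
        return self.classifier(self.features(x).flatten(1))

class CNN2D(nn.Module):
    """Branch 2: 2D convolutions over the axes-by-time graph built from the same window."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d((1, 2)),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d((1, 2)),
        )
        self.classifier = nn.Linear(64 * NUM_AXES * (WINDOW_LEN // 4), NUM_CLASSES)

    def forward(self, x):                       # x: (batch, 1, NUM_AXES, WINDOW_LEN)
        return self.classifier(self.features(x).flatten(1))

def weighted_vote(logits_1d, logits_2d, w1=0.5, w2=0.5):
    """Combine the two branches by a weighted sum of their softmax scores (weights assumed)."""
    probs = w1 * F.softmax(logits_1d, dim=1) + w2 * F.softmax(logits_2d, dim=1)
    return probs.argmax(dim=1)

if __name__ == "__main__":
    window = torch.randn(4, NUM_AXES, WINDOW_LEN)           # a batch of random sensor windows
    model_1d, model_2d = CNN1D(), CNN2D()
    preds = weighted_vote(model_1d(window), model_2d(window.unsqueeze(1)))
    print(preds)                                            # predicted activity indices
```

In a real pipeline the two branches would be trained separately on the same labeled windows, and the voting weights could be tuned on a validation set rather than fixed at 0.5.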
Pages: 73305-73328
Number of pages: 24