Deep learning models for real-life human activity recognition from smartphone sensor data

Cited by: 4
Authors
Garcia-Gonzalez, Daniel [1 ]
Rivero, Daniel [1 ]
Fernandez-Blanco, Enrique [1 ]
Luaces, Miguel R. [1 ]
Affiliations
[1] Univ A Coruna, Dept Comp Sci & Informat Technol, CITIC, La Coruna 15071, Spain
Keywords
HAR; CNN; LSTM; Real life; Smartphones; Sensors
DOI
10.1016/j.iot.2023.100925
CLC number
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Human activity recognition (HAR) is currently a very active topic within the scientific community. Given the low cost, ease of use, and high accuracy of the sensors in wearable devices and smartphones, an increasing number of researchers are contributing to this area. Until very recently, however, almost all work in this field was carried out under laboratory conditions, with few similarities to our daily lives. This paper focuses on the newer trend of transferring the knowledge acquired so far to a real-life environment, using a previously published dataset collected under this philosophy; the goal is to identify the activities recorded there. To perform this classification, the paper explores new designs and architectures inspired by the models that have yielded the best results in the literature. More specifically, different configurations of Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) networks were tested, under real-life rather than laboratory conditions. Notably, the hybrid models combining these two techniques yielded the best results, with a peak accuracy of 94.80% on the dataset used.
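For orientation, below is a minimal sketch of a CNN-LSTM hybrid of the kind the abstract describes, written in Keras. The window length (128 samples), the six inertial channels, the layer sizes, and the four activity labels are illustrative assumptions for this sketch, not the authors' exact configuration.

```python
# Minimal CNN-LSTM hybrid for windowed smartphone-sensor classification.
# Shapes are assumptions: 128-sample windows, 6 channels
# (3-axis accelerometer + 3-axis gyroscope), 4 activity classes.
import numpy as np
from tensorflow.keras import layers, models

WINDOW_LEN = 128  # samples per window (assumption)
N_CHANNELS = 6    # acc x/y/z + gyro x/y/z (assumption)
N_CLASSES = 4     # e.g. inactive/active/walking/driving (assumed labels)

model = models.Sequential([
    layers.Input(shape=(WINDOW_LEN, N_CHANNELS)),
    # Convolutional front end extracts local motion patterns.
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    # Recurrent back end models temporal order across the window.
    layers.LSTM(64),
    layers.Dropout(0.5),
    layers.Dense(N_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy data with the expected shapes, just to show the call pattern.
x = np.random.randn(32, WINDOW_LEN, N_CHANNELS).astype("float32")
y = np.random.randint(0, N_CLASSES, size=(32,))
model.fit(x, y, epochs=1, batch_size=16, verbose=0)
```

The design choice mirrors the abstract's finding: convolutions capture short-range signal shape, while the LSTM captures how those features evolve over the window, which is what the hybrid combination exploits.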
Pages: 22