Evaluation of deep learning model for human activity recognition

Cited by: 7
Authors
Bhat, Owais [1 ]
Khan, Dawood A. [1 ]
Affiliations
[1] Univ Kashmir, Dept Comp Sci, North Campus, Srinagar 190006, Jammu & Kashmir, India
Keywords
Deep learning; Sensors; Activity recognition; Classification; Convolutional neural network;
DOI
10.1007/s12530-021-09373-6
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Recognizing a person's physical activity with certainty is an important aspect of intelligent computing. Modern smart devices are equipped with powerful sensors that are well suited to the sensor-based human activity recognition (AR) task. Traditional approaches to human activity recognition have made significant progress, but most of these methods rely on manual feature extraction. The design and selection of relevant features is the most challenging part of the sensor-based human AR problem: manually extracted features hinder generalization of performance, and such handcrafted features cannot reliably distinguish similar or complex activities. In this paper, we propose a deep learning based method for the human activity recognition problem. The method uses convolutional neural networks to automatically extract features from raw sensor data and classify six basic human activities. Furthermore, transfer learning is used to reduce the computational cost of training the model from scratch for a new user. The model uses the labelled information from supervised learning to mutually enhance feature extraction and classification. Experiments carried out on a benchmark dataset verify the strong advantage of the proposed method over traditional human AR algorithms such as Random Forest (RF) and multiclass Support Vector Machine (SVM).
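The abstract describes two technical steps: a convolutional network that learns features directly from raw inertial signals, and a transfer-learning stage that adapts a trained model to a new user at reduced cost. The paper itself is not reproduced here, so the following is a minimal PyTorch sketch of that pipeline under stated assumptions: fixed-length sensor windows of 9 channels x 128 timesteps (the shape of the common UCI HAR benchmark), six output classes, and user adaptation done by freezing the feature extractor and retraining only the classifier head. The layer sizes and the names HARConvNet and adapt_to_new_user are illustrative, not the authors' exact design.

```python
# Hypothetical sketch of a 1-D CNN for sensor-based HAR; architecture
# details are assumptions, not the model described in the paper.
import torch
import torch.nn as nn

class HARConvNet(nn.Module):
    def __init__(self, in_channels: int = 9, num_classes: int = 6):
        super().__init__()
        # Convolutional feature extractor: learns features directly from
        # the raw signal, replacing handcrafted feature engineering.
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )
        # Classifier head mapping learned features to activity labels.
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, timesteps), e.g. (N, 9, 128)
        z = self.features(x).squeeze(-1)  # -> (N, 128)
        return self.classifier(z)         # -> (N, num_classes)

def adapt_to_new_user(model: HARConvNet, num_classes: int = 6) -> HARConvNet:
    """Transfer-learning step sketched from the abstract: freeze the
    convolutional feature extractor and retrain only a fresh classifier
    head on the new user's (smaller) labelled dataset."""
    for p in model.features.parameters():
        p.requires_grad = False
    model.classifier = nn.Linear(128, num_classes)  # reinitialized head
    return model

# Usage sketch:
#   pretrained = HARConvNet()            # ...train on source users...
#   user_model = adapt_to_new_user(pretrained)
#   optimizer = torch.optim.Adam(user_model.classifier.parameters(), lr=1e-3)
```

Freezing the convolutional layers keeps the bulk of the parameters fixed, so adapting to a new user optimizes only the small linear head, which is consistent with the abstract's claim of reduced training cost compared with training from scratch.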
Pages: 159-168
Page count: 10
Related Papers
50 items in total
  • [21] Egocentric Vision for Human Activity Recognition Using Deep Learning
    Douache, Malika
    Benmoussat, Badra Nawal
    JOURNAL OF INFORMATION PROCESSING SYSTEMS, 2023, 19 (06) : 730 - 744
  • [22] A Comparative Research on Human Activity Recognition Using Deep Learning
    Tufek, Nilay
    Ozkaya, Ozen
    2019 27TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2019,
  • [23] Human activity recognition from UAV videos using an optimized hybrid deep learning model
    Sinha, Kumari Priyanka
    Kumar, Prabhat
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 83 (17) : 51669 - 51698
  • [24] Human Activity Recognition using Deep Learning
    Moola, Ramu
    Hossain, Ashraf
    2022 URSI REGIONAL CONFERENCE ON RADIO SCIENCE, URSI-RCRS, 2022, : 165 - 168
  • [25] Attention-Based Deep Learning Framework for Human Activity Recognition With User Adaptation
    Buffelli, Davide
    Vandin, Fabio
    IEEE SENSORS JOURNAL, 2021, 21 (12) : 13474 - 13483
  • [26] Human Activity Recognition With Smartphone and Wearable Sensors Using Deep Learning Techniques: A Review
    Ramanujam, E.
    Perumal, Thinagaran
    Padmavathi, S.
    IEEE SENSORS JOURNAL, 2021, 21 (12) : 13029 - 13040
  • [27] Intelligent Localization and Deep Human Activity Recognition through IoT Devices
    Alazeb, Abdulwahab
    Azmat, Usman
    Al Mudawi, Naif
    Alshahrani, Abdullah
    Alotaibi, Saud S.
    Almujally, Nouf Abdullah
    Jalal, Ahmad
    SENSORS, 2023, 23 (17)
  • [28] A Novel Deep Learning Model for Smartphone-Based Human Activity Recognition
    Agti, Nadia
    Sabri, Lyazid
    Kazar, Okba
    Chibani, Abdelghani
    MOBILE AND UBIQUITOUS SYSTEMS: COMPUTING, NETWORKING AND SERVICES, MOBIQUITOUS 2023, PT II, 2024, 594 : 231 - 243
  • [29] Human Physical Activity Recognition Based on Computer Vision with Deep Learning Model
    Mo, Lingfei
    Li, Fan
    Zhu, Yanjia
    Huang, Anjie
    2016 IEEE INTERNATIONAL INSTRUMENTATION AND MEASUREMENT TECHNOLOGY CONFERENCE PROCEEDINGS, 2016, : 1211 - 1216
  • [30] An active semi-supervised deep learning model for human activity recognition
    Haixia Bi
    Miquel Perello-Nieto
    Raul Santos-Rodriguez
    Peter Flach
    Ian Craddock
    Journal of Ambient Intelligence and Humanized Computing, 2023, 14 : 13049 - 13065