Motion Primitive Forests for Human Activity Recognition Using Wearable Sensors

Cited by: 4
Authors
Nguyen Ngoc Diep [1 ,2 ]
Cuong Pham [1 ,2 ]
Tu Minh Phuong [1 ,2 ]
Affiliations
[1] Posts & Telecommun Inst Technol, Dept Comp Sci, Hanoi, Vietnam
[2] Posts & Telecommun Inst Technol, Machine Learning & Applicat Lab, Hanoi, Vietnam
Source
PRICAI 2016: TRENDS IN ARTIFICIAL INTELLIGENCE | 2016, Vol. 9810
Keywords
Human activity recognition; Wearable sensors; Motion primitive forests; Random forests; Bag of features;
DOI
10.1007/978-3-319-42911-3_29
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Human activity recognition is important in many applications such as fitness logging, pervasive healthcare, near-emergency warning, and social networking. Using body-worn sensors, these applications detect the users' activities to understand their context and provide appropriate assistance. For accurate recognition, it is crucial to design an appropriate feature representation of the sensor data. In this paper, we propose a new type of motion feature: motion primitive forests, which are randomized ensembles of decision trees that act on the original local features by clustering them to form motion primitives (or words). Bags of these features, which accumulate histograms of the resulting motion primitives over each data frame, are then used to build activity models. We experimentally validated the effectiveness of the proposed method on accelerometer data from three benchmark datasets. On all three datasets, the proposed motion primitive forests provided substantially higher accuracy than existing state-of-the-art methods, and were much faster in both training and prediction than k-means feature learning. In addition, the method showed stable results over different types of original local features, indicating the ability of random forests to select relevant local features.
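The pipeline the abstract describes — a random forest whose leaves act as a codebook of motion primitives, with per-frame histograms of leaf assignments fed to an activity classifier — can be sketched as follows. This is a minimal illustration with synthetic data and scikit-learn, not the authors' implementation; all array shapes, parameter values, and variable names (`X_local`, `feats_per_frame`, etc.) are assumptions for the example.

```python
# Sketch: random-forest codebook ("motion primitive forest") + bag-of-features.
# Synthetic data and hyperparameters are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic accelerometer local features: 200 frames, each holding
# 20 local feature vectors of dimension 9 (e.g. per-axis mean/std/energy).
n_frames, feats_per_frame, dim, n_classes = 200, 20, 9, 4
X_local = rng.normal(size=(n_frames, feats_per_frame, dim))
y_frame = rng.integers(0, n_classes, size=n_frames)

# 1) Train a small forest on the local features, labeling each local
#    feature with its frame's activity. Each leaf of each tree then
#    acts as one "motion primitive" (codeword).
forest = RandomForestClassifier(n_estimators=5, max_leaf_nodes=16,
                                random_state=0)
forest.fit(X_local.reshape(-1, dim), np.repeat(y_frame, feats_per_frame))

# 2) Map every local feature to its leaf in every tree, then accumulate
#    a per-frame histogram over all (tree, leaf) codewords.
leaves = forest.apply(X_local.reshape(-1, dim))   # shape (N, n_trees)
n_trees = leaves.shape[1]
# Offset leaf ids per tree so codewords from different trees never collide.
node_counts = np.array([t.tree_.node_count for t in forest.estimators_])
offsets = np.concatenate([[0], node_counts.cumsum()[:-1]])
codes = leaves + offsets                          # global codeword ids
vocab = int(node_counts.sum())

hist = np.zeros((n_frames, vocab))
for f in range(n_frames):
    frame_codes = codes[f * feats_per_frame:(f + 1) * feats_per_frame].ravel()
    np.add.at(hist, (f, frame_codes), 1)          # handles repeated codewords

# 3) Train the activity model on the bag-of-primitives histograms.
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(hist, y_frame)
```

Each histogram row sums to `feats_per_frame * n_trees`, since every local feature lands in exactly one leaf per tree; normalizing the rows is a common variant when frames contain varying numbers of local features.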
Pages: 340-353 (14 pages)
Related Papers
50 records total
  • [1] Sparse Representation for Motion Primitive-Based Human Activity Modeling and Recognition Using Wearable Sensors
    Zhang, Mi
    Xu, Wenyao
    Sawchuk, Alexander A.
    Sarrafzadeh, Majid
    2012 21ST INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR 2012), 2012, : 1807 - 1810
  • [2] Human Activity Recognition Using Wearable Accelerometer Sensors
    Zubair, Muhammad
    Song, Kibong
    Yoon, Changwoo
    2016 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS-ASIA (ICCE-ASIA), 2016,
  • [3] Deep Human Activity Recognition Using Wearable Sensors
    Lawal, Isah A.
    Bano, Sophia
    12TH ACM INTERNATIONAL CONFERENCE ON PERVASIVE TECHNOLOGIES RELATED TO ASSISTIVE ENVIRONMENTS (PETRA 2019), 2019, : 45 - 48
  • [4] A Survey on Human Activity Recognition using Wearable Sensors
    Lara, Oscar D.
    Labrador, Miguel A.
    IEEE COMMUNICATIONS SURVEYS AND TUTORIALS, 2013, 15 (03): : 1192 - 1209
  • [5] Physical Human Activity Recognition Using Wearable Sensors
    Attal, Ferhat
    Mohammed, Samer
    Dedabrishvili, Mariam
    Chamroukhi, Faicel
    Oukhellou, Latifa
    Amirat, Yacine
    SENSORS, 2015, 15 (12) : 31314 - 31338
  • [6] Orientation Independent Activity/Gesture Recognition Using Wearable Motion Sensors
    Wu, Jian
    Jafari, Roozbeh
    IEEE INTERNET OF THINGS JOURNAL, 2019, 6 (02): : 1427 - 1437
  • [7] Segmentation and recognition of human motion sequences using wearable inertial sensors
    Guo, Ming
    Wang, Zhelong
    MULTIMEDIA TOOLS AND APPLICATIONS, 2018, 77 (16) : 21201 - 21220
  • [8] Energy Efficient Human Activity Recognition Using Wearable Sensors
    Ding, Genming
    Tian, Jun
    Wu, Jinsong
    Zhao, Qian
    Xie, Lili
    2018 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE WORKSHOPS (WCNCW), 2018, : 379 - 383
  • [9] An Improved Algorithm for Human Activity Recognition Using Wearable Sensors
    Chen, Ye
    Guo, Ming
    Wang, Zhelong
    2016 EIGHTH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2016, : 248 - 252