WOODY: A Post-Process Method for Smartphone-Based Activity Recognition

Cited by: 10
Authors
Wang, Changhai [1 ]
Xu, Yuwei [2 ]
Liang, Hui [1 ]
Huang, Wanwei [1 ]
Zhang, Ling [1 ]
Affiliations
[1] Zhengzhou Univ Light Ind, Software Engn Coll, Zhengzhou 450002, Henan, Peoples R China
[2] Nankai Univ, Coll Cyberspace Secur, Tianjin 300350, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Human activity recognition; smartphone; post-process method; hidden Markov model; classification confidence; ENSEMBLE; BEHAVIOR;
DOI
10.1109/ACCESS.2018.2866872
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
In the past decade, the rapid popularization of smartphones has provided a promising direction for human activity recognition. Although smartphones can identify a variety of movements without any complicated wearable device, smartphone-based activity recognition is still deeply affected by differences between users and phone locations. To overcome this problem, post-processing attempts to correct the errors in the classified activity sequence. Considering both the continuity of the activity sequence and the confidence of the recognition results, we propose WOODY, a novel post-process method that locates and corrects the errors in a classified activity sequence, much as Woody Woodpecker pecks holes to catch pests. In our method, the recognition result is treated as a weighted observation state, and a weighted observation hidden Markov model (WOHMM) is built to model the classified activity sequence. A sequence labeling algorithm for the WOHMM is then designed to modify those recognition results with low confidence. To validate the effectiveness of WOODY, we conduct a series of comparison experiments on two public data sets collected from real-world scenarios. The results show that WOODY not only improves recognition accuracy but also significantly enhances robustness.
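The HMM-based post-processing the abstract describes can be sketched as follows. This is a minimal illustration of the general idea only, not the authors' WOHMM: the self-transition prior (`stay_prob`), the uniform transition matrix, and the mapping from classifier confidence to emission probabilities are all assumptions made for this sketch. It runs Viterbi decoding over the classified sequence, so isolated low-confidence predictions get overridden by their temporal context while confident, sustained activity changes are kept.

```python
import numpy as np

def smooth_activity_sequence(preds, confs, n_states, stay_prob=0.9):
    """Viterbi-style post-processing of a classified activity sequence.

    preds : per-window activity labels from the base classifier
    confs : per-window classification confidence, each in (0, 1)
    """
    # Transition matrix encoding activity continuity: remain in the
    # current activity with probability `stay_prob`, otherwise switch
    switch = (1.0 - stay_prob) / (n_states - 1)
    trans = np.full((n_states, n_states), switch)
    np.fill_diagonal(trans, stay_prob)
    log_trans = np.log(trans)

    def log_emit(t):
        # Confidence-weighted observation: with probability confs[t] the
        # true activity equals the prediction; the remaining mass is
        # spread uniformly over the other activities
        e = np.full(n_states, (1.0 - confs[t]) / (n_states - 1))
        e[preds[t]] = confs[t]
        return np.log(e)

    T = len(preds)
    log_delta = np.empty((T, n_states))
    backptr = np.zeros((T, n_states), dtype=int)
    log_delta[0] = np.log(1.0 / n_states) + log_emit(0)
    for t in range(1, T):
        scores = log_delta[t - 1][:, None] + log_trans  # (prev, cur)
        backptr[t] = scores.argmax(axis=0)
        log_delta[t] = scores.max(axis=0) + log_emit(t)

    # Backtrack the most likely corrected activity path
    path = np.empty(T, dtype=int)
    path[-1] = log_delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = backptr[t + 1, path[t + 1]]
    return path.tolist()

# An isolated low-confidence window is corrected to match its context,
# while a sustained, confident change of activity is preserved:
print(smooth_activity_sequence([0, 0, 1, 0, 0], [0.9, 0.9, 0.55, 0.9, 0.9], 2))  # → [0, 0, 0, 0, 0]
print(smooth_activity_sequence([0, 0, 1, 1, 1], [0.9] * 5, 2))                   # → [0, 0, 1, 1, 1]
```

The paper additionally weights the observation states themselves (hence "weighted observation" HMM); here the confidence enters only through the emission distribution, which is the simplest place to encode it.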
Pages: 49611-49625
Page count: 15