Sensor Data Acquisition and Multimodal Sensor Fusion for Human Activity Recognition Using Deep Learning

Cited by: 139
Authors
Chung, Seungeun [1 ]
Lim, Jiyoun [1 ]
Noh, Kyoung Ju [1 ]
Kim, Gague [1 ]
Jeong, Hyuntae [1 ]
Affiliations
[1] Elect & Telecommun Res Inst, SW Contents Basic Technol Res Grp, Daejeon 34129, South Korea
Keywords
mobile sensing; sensor position; human activity recognition; multimodal sensor fusion; classifier-level ensemble; Long Short-Term Memory network; deep learning; CONTEXT; SYSTEM;
DOI
10.3390/s19071716
Chinese Library Classification
O65 [Analytical Chemistry];
Subject Classification Codes
070302; 081704;
Abstract
In this paper, we present a systematic study of on-body sensor positioning and data-acquisition details for Human Activity Recognition (HAR) systems. We build a testbed consisting of eight body-worn Inertial Measurement Unit (IMU) sensors and an Android mobile device for activity data collection. We develop a Long Short-Term Memory (LSTM) network framework to train a deep learning model on human activity data acquired in both real-world and controlled environments. From the experimental results, we find that activity data sampled at rates as low as 10 Hz from four sensors (at both wrists, the right ankle, and the waist) is sufficient for recognizing Activities of Daily Living (ADLs), including eating and driving. We adopt a two-level ensemble model to combine the class probabilities of multiple sensor modalities, and demonstrate that classifier-level sensor fusion improves classification performance. By analyzing the accuracy of each sensor on different types of activity, we derive custom weights for multimodal sensor fusion that reflect the characteristics of individual activities.
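The classifier-level fusion the abstract describes, combining per-sensor class probabilities with per-activity custom weights, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name, the sensor set, and the uniform baseline weights are hypothetical.

```python
import numpy as np

def fuse_class_probabilities(probs_per_sensor, sensor_weights):
    """Classifier-level fusion: weighted sum of per-sensor class-probability
    vectors, with a weight per (sensor, activity class) pair, renormalized
    so the fused vector is again a probability distribution."""
    probs = np.asarray(probs_per_sensor)    # shape: (n_sensors, n_classes)
    weights = np.asarray(sensor_weights)    # shape: (n_sensors, n_classes)
    fused = (weights * probs).sum(axis=0)   # weighted sum per class
    return fused / fused.sum()              # renormalize

# Hypothetical example: 3 sensors (wrist, ankle, waist), 4 activity classes.
# Each row is one modality's softmax output for the same time window.
probs = [
    [0.7, 0.1, 0.1, 0.1],   # wrist IMU classifier
    [0.4, 0.3, 0.2, 0.1],   # ankle IMU classifier
    [0.5, 0.2, 0.2, 0.1],   # waist IMU classifier
]
weights = np.ones((3, 4)) / 3   # uniform baseline; the paper learns
                                # activity-specific weights instead
fused = fuse_class_probabilities(probs, weights)
predicted_class = int(fused.argmax())
```

Replacing the uniform `weights` with values that upweight, per activity, the sensors that classify that activity most accurately is the idea behind the custom weights mentioned in the abstract.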
Pages: 20