A Novel IoT-Perceptive Human Activity Recognition (HAR) Approach Using Multihead Convolutional Attention

Cited by: 134
Authors
Zhang, Haoxi [1 ]
Xiao, Zhiwen [1 ]
Wang, Juan [1 ]
Li, Fei [1 ]
Szczerbicki, Edward [2 ]
Affiliations
[1] Chengdu Univ Informat Technol, Sch Cybersecur, Chengdu 610225, Peoples R China
[2] Gdansk Univ Technol, Fac Management & Econ, Dept Management, PL-80233 Gdansk, Poland
Source
IEEE INTERNET OF THINGS JOURNAL | 2020, Vol. 7, No. 2
Keywords
Feature extraction; Activity recognition; Internet of Things; Convolutional neural networks; Wireless sensor networks; Computer architecture; Attention mechanism; deep learning; human activity recognition (HAR); Internet of Things (IoT); INTERNET; THINGS;
DOI
10.1109/JIOT.2019.2949715
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Together with the fast advancement of the Internet of Things (IoT), smart healthcare applications and systems are equipped with increasingly more wearable sensors and mobile devices. These sensors are used not only to collect data but also, and more importantly, to assist in tracking and analyzing the daily activities of their users. Various human activity recognition (HAR) approaches are used to enhance such tracking. Most existing HAR methods depend on exploratory, case-based, shallow feature learning architectures, which struggle with correct activity recognition when put into real-life practice. To tackle this problem, we propose a novel approach that utilizes convolutional neural networks (CNNs) and an attention mechanism for HAR. In the presented method, activity recognition accuracy is improved by incorporating attention into multihead CNNs for better feature extraction and selection. Proof-of-concept experiments are conducted on a publicly available data set from the Wireless Sensor Data Mining (WISDM) Lab. The results demonstrate that our proposed approach achieves higher accuracy than current methods.
Pages: 1072-1080
Page count: 9
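The abstract describes incorporating attention into multihead CNNs for feature extraction and selection on wearable-sensor data. The sketch below is a minimal, hypothetical PyTorch illustration of that general idea, assuming tri-axial accelerometer windows as input and a soft attention weighting over several convolutional branches; the layer sizes, kernel sizes, pooling scheme, and number of activity classes are illustrative assumptions, not the authors' exact architecture.

```python
# Hypothetical sketch: multihead CNN with attention-based feature fusion for HAR.
# Assumes input windows shaped (batch, 3 accelerometer channels, window_length).
# This is NOT the paper's exact model; all hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiheadConvAttentionHAR(nn.Module):
    def __init__(self, in_channels=3, num_classes=6, num_heads=3, hidden=64):
        super().__init__()
        # Each "head" is an independent 1-D convolutional branch with its own
        # kernel size, so different heads can capture motion patterns at
        # different temporal scales (an assumption about the design intent).
        kernel_sizes = [3, 5, 7][:num_heads]
        self.heads = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(in_channels, hidden, kernel_size=k, padding=k // 2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),  # pool each branch to one feature vector
            )
            for k in kernel_sizes
        ])
        # Attention scores one weight per head, so more informative branches
        # dominate the fused representation (feature "selection").
        self.attn = nn.Linear(hidden, 1)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, x):
        # x: (batch, channels, time)
        feats = torch.stack([h(x).squeeze(-1) for h in self.heads], dim=1)  # (B, heads, hidden)
        scores = F.softmax(self.attn(feats), dim=1)                         # (B, heads, 1)
        fused = (scores * feats).sum(dim=1)                                 # (B, hidden)
        return self.classifier(fused)

# Usage example: a batch of 8 windows of 128 samples (x, y, z axes),
# e.g. WISDM-style accelerometer segments with 6 activity classes assumed.
model = MultiheadConvAttentionHAR()
logits = model(torch.randn(8, 3, 128))
print(logits.shape)  # torch.Size([8, 6])
```

The single attention layer here simply reweights per-branch feature vectors before classification; it stands in for whatever attention formulation the paper actually uses, which the record's abstract does not specify.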