Activity-Based Person Identification Using Multimodal Wearable Sensor Data

Cited: 20
Authors
Luo, Fei [1 ]
Khan, Salabat [1 ]
Huang, Yandao [1 ]
Wu, Kaishun [1 ]
Affiliations
[1] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
Keywords
Biometrics; feature fusion; machine learning; multimodal sensor; person identification; AUTHENTICATION;
DOI
10.1109/JIOT.2022.3209084
CLC Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Wearable devices equipped with a variety of sensors facilitate the measurement of physiological and behavioral characteristics. Activity-based person identification is considered an emerging and fast-evolving technology in the security and access control fields. Wearables, such as smartphones, the Apple Watch, and Google Glass, can continuously sense and collect activity-related information from users, and activity patterns can be extracted to distinguish individuals. Although various human activities have been widely studied, few of them (mainly gait and keystrokes) have been used for person identification. In this article, we performed person identification using two public benchmark data sets (UCI-HAR and WISDM2019), which were collected during several different activities using multimodal sensors (accelerometer and gyroscope) embedded in wearable devices (smartphone and smartwatch). We implemented eight classifiers: a multivariate squeeze-and-excitation network (MSENet), a time series transformer (TST), a temporal convolutional network (TCN), CNN-LSTM, ConvLSTM, XGBoost, decision tree, and k-nearest neighbors. The proposed MSENet can model the relationships between different sensor data. It achieved the best person identification accuracies under different activities: 91.31% and 97.79%, respectively, on the public UCI-HAR and WISDM2019 data sets. We also investigated the effects of sensor modality, human activity, feature fusion, and window size for sensor signal segmentation. Compared with related work, our approach achieves state-of-the-art performance.
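The window-size study mentioned in the abstract presupposes a segmentation step that slices the continuous accelerometer/gyroscope stream into fixed-length windows before classification. A minimal sketch of that step, using NumPy; the window length (128 samples, i.e. 2.56 s at 50 Hz) and 50% overlap follow the common UCI-HAR convention, and the function name `sliding_windows` is illustrative, not from the paper:

```python
import numpy as np

def sliding_windows(signal: np.ndarray, window_size: int, step: int) -> np.ndarray:
    """Segment a multichannel sensor stream of shape (samples, channels)
    into fixed-length, possibly overlapping windows.

    Returns an array of shape (n_windows, window_size, channels).
    """
    n_samples = signal.shape[0]
    starts = range(0, n_samples - window_size + 1, step)
    return np.stack([signal[s:s + window_size] for s in starts])

# Example: a 6-channel stream (3-axis accelerometer + 3-axis gyroscope),
# cut into 128-sample windows with 50% overlap (step = 64).
stream = np.random.randn(500, 6)
windows = sliding_windows(stream, window_size=128, step=64)
print(windows.shape)  # (6, 128, 6)
```

Each resulting window is one classifier input; varying `window_size` and `step` is what the window-size experiments in the abstract sweep over.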
Pages: 1711-1723
Page count: 13