A Sensors Based Deep Learning Model for Unseen Locomotion Mode Identification using Multiple Semantic Matrices

Cited by: 19
Authors
Mishra, Rahul [1 ]
Gupta, Ashish [1 ]
Gupta, Hari Prabhat [1 ]
Dutta, Tanima [1 ]
Affiliations
[1] Indian Institute of Technology (BHU) Varanasi, Department of Computer Science and Engineering, Varanasi 221005, Uttar Pradesh, India
Keywords
Machine learning; Feature extraction; Sensors; Semantics; Training; Data models; Computational modeling; Deep learning; Identification; Locomotion mode; Classification
DOI
10.1109/TMC.2020.3015546
Chinese Library Classification
TP [Automation technology; computer technology]
Subject Classification Code
0812
Abstract
With the availability of various sensors in smartphones, identifying a locomotion mode has become convenient and effortless in recent years. Information about locomotion mode helps to improve journey planning, travel time estimation, and traffic management. Although a significant amount of work exists on locomotion mode recognition, the performance of these approaches depends heavily on labeled training instances. As it is impractical to gather prior information (labeled instances) about all types of locomotion modes, a recognition model should be able to identify a new or unseen locomotion mode without any corresponding training instances. This paper proposes a sensor-based deep learning model that identifies a locomotion mode using labeled training instances and incorporates the concept of zero-shot learning to identify an unseen locomotion mode. The model obtains an attribute matrix by fusing three semantic matrices and constructs a feature matrix by extracting deep learning and hand-crafted features from the training instances. It then builds a classifier by learning a mapping between the attribute and feature matrices. Finally, the approach is evaluated on collected and existing datasets using accuracy and F1 score.
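The abstract describes learning a mapping between a feature matrix (deep plus hand-crafted features per sensor window) and a class-level attribute matrix, so that an unseen locomotion mode can be recognized from its attribute vector alone. The sketch below is a minimal, hypothetical illustration of such a zero-shot mapping, using a ridge-regression projection into attribute space followed by nearest-attribute matching; all names, shapes, the fusion of semantic matrices, and the regularisation choice are assumptions for illustration, not the authors' implementation.

import numpy as np

# Hypothetical sizes; the paper's actual dimensions are not given here.
rng = np.random.default_rng(0)
n_train, d_feat, d_attr = 500, 64, 12        # training windows, feature dim, attribute dim
seen_classes, unseen_classes = 5, 2

# Feature matrix X: deep + hand-crafted features per sensor window (assumed precomputed).
X = rng.normal(size=(n_train, d_feat))
y = rng.integers(0, seen_classes, size=n_train)

# Attribute matrix A: one semantic vector per class, e.g. obtained by fusing
# several semantic matrices (random placeholders stand in for that fusion).
A = rng.normal(size=(seen_classes + unseen_classes, d_attr))

# Learn a ridge-regression mapping W so that X @ W approximates the attribute
# vector of each training window's class.
T = A[y]                                      # target attribute vector per window
lam = 1.0                                     # regularisation strength (assumed)
W = np.linalg.solve(X.T @ X + lam * np.eye(d_feat), X.T @ T)

def predict(x_new):
    # Project a new window into attribute space and return the class whose
    # attribute vector is most similar; unseen classes are covered because
    # only their attribute vectors (rows of A) are required at test time.
    proj = x_new @ W                          # shape (d_attr,)
    sims = A @ proj / (np.linalg.norm(A, axis=1) * np.linalg.norm(proj) + 1e-9)
    return int(np.argmax(sims))               # index over all seen + unseen classes

print(predict(rng.normal(size=d_feat)))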
Pages: 799-810
Page count: 12