C2FHAR: Coarse-to-Fine Human Activity Recognition With Behavioral Context Modeling Using Smart Inertial Sensors

Cited by: 19
Authors
Ehatisham-Ul-Haq, Muhammad [1 ]
Azam, Muhammad Awais [1 ,2 ]
Amin, Yasar [1 ]
Naeem, Usman [3 ]
Affiliations
[1] UET, Fac Telecom & Informat Engn, Taxila 47050, Pakistan
[2] Whitecliffe Technol, Fac Informat Technol, Wellington 6011, New Zealand
[3] Queen Mary Univ London, Sch Elect Engn & Comp Sci, Fac Sci & Engn, London E1 4NS, England
Keywords
Activity recognition; behavioral context; context-aware; machine learning; smart sensing; ACCELEROMETER DATA; WEARABLE SENSORS; DATA FUSION; MOBILE; CLASSIFICATION; ALGORITHMS; NETWORKS; FEATURES; SYSTEM;
DOI
10.1109/ACCESS.2020.2964237
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Smart sensing devices are furnished with an array of sensors, including locomotion sensors, which enable continuous and passive monitoring of human activities for ambient-assisted living. As a result, sensor-based human activity recognition has earned significant popularity in the past few years, and many successful research studies have been conducted in this area. However, the accurate recognition of in-the-wild human activities in real time remains a fundamental challenge, as human physical activity patterns are adversely affected by their behavioral contexts. Moreover, it is essential to infer a user's behavioral context along with the physical activity to enable context-aware and knowledge-driven applications in real time. Therefore, this research work presents "C2FHAR", a novel approach for coarse-to-fine human activity recognition in the wild, which explicitly models the user's behavioral contexts with activities of daily living to learn and recognize fine-grained human activities. To address real-time activity recognition challenges, the proposed scheme utilizes a multi-label classification model that identifies in-the-wild human activities at two different levels, i.e., coarse or fine-grained, depending upon the real-time use case. The proposed scheme is validated with extensive experiments using heterogeneous sensors, which demonstrate its efficacy.
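The coarse-to-fine idea described in the abstract can be sketched as a multi-label classifier that predicts a physical activity and a behavioral context separately, then combines them into a fine-grained label. The sketch below is purely illustrative and is not the authors' implementation: the feature vectors, labels, and the nearest-centroid classifier are all hypothetical stand-ins for the paper's inertial-sensor features and learned model.

```python
# Illustrative sketch (not the paper's method): a multi-label nearest-centroid
# classifier that jointly predicts a coarse physical activity and a behavioral
# context; their combination yields a fine-grained activity label.
from statistics import mean

# Hypothetical training data: (feature vector, activity label, context label).
# The two features stand in for accelerometer statistics (e.g., mean, variance).
train = [
    ([0.1, 0.2], "sitting", "at_work"),
    ([0.2, 0.1], "sitting", "at_work"),
    ([0.9, 1.1], "walking", "outdoors"),
    ([1.0, 0.9], "walking", "outdoors"),
]

def centroids(samples, label_index):
    """Per-label mean feature vector for one label dimension (0=activity, 1=context)."""
    groups = {}
    for feats, *labels in samples:
        groups.setdefault(labels[label_index], []).append(feats)
    return {lbl: [mean(col) for col in zip(*vecs)] for lbl, vecs in groups.items()}

def nearest(feats, cents):
    """Label whose centroid is closest in squared Euclidean distance."""
    return min(cents, key=lambda l: sum((a - b) ** 2 for a, b in zip(feats, cents[l])))

act_cents = centroids(train, 0)   # coarse physical activities
ctx_cents = centroids(train, 1)   # behavioral contexts

def predict(feats):
    # Two independent labels (coarse level) plus their fine-grained combination.
    act, ctx = nearest(feats, act_cents), nearest(feats, ctx_cents)
    return act, ctx, f"{act}@{ctx}"

print(predict([0.95, 1.0]))  # -> ('walking', 'outdoors', 'walking@outdoors')
```

Predicting the two label dimensions separately is what makes the scheme coarse-to-fine: an application can stop at the coarse activity label, or combine it with the context label when a fine-grained distinction (e.g., walking outdoors vs. walking at work) is needed.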
Pages: 7731-7747 (17 pages)