Learning architecture for the recognition of walking and prediction of gait period using wearable sensors

Cited by: 7
Authors
Martinez-Hernandez, Uriel [1 ]
Awad, Mohammed I. [2 ]
Dehghani-Sanij, Abbas A. [3 ]
Affiliations
[1] Univ Bath, Dept Elect & Elect Engn, Ctr Autonomous Robot CENTAUR, Bath BA2 7AY, Avon, England
[2] Ain Shams Univ, Dept Mechatron Engn, Cairo, Egypt
[3] Univ Leeds, Sch Mech Engn, Leeds, W Yorkshire, England
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK;
Keywords
Activity recognition; Deep learning; Learning architectures; Wearable sensors; INTENT RECOGNITION; CLASSIFICATION; ALGORITHMS; LOCOMOTION; NETWORKS; SYSTEM;
DOI
10.1016/j.neucom.2021.10.044
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This work presents a novel learning architecture for the recognition of walking activity and the prediction of gait period using wearable sensors. The approach is composed of a Convolutional Neural Network (CNN), a Predicted Information Gain (PIG) module and an adaptive combination of information sources. The CNN provides the recognition of walking activities and gait periods. This information is used by the proposed PIG method to estimate the next most probable gait period along the gait cycle. The outputs from the CNN and PIG modules are combined by a proposed adaptive process, which relies on whichever information source proves more reliable. This adaptive combination ensures that the learning architecture provides accurate recognition and prediction of walking activity and gait periods over time. The learning architecture uses data from an array of three inertial measurement units attached to the lower limbs of individuals. The work is validated on the recognition of level-ground walking, ramp ascent and ramp descent, and the prediction of gait periods. Recognition accuracy for walking activity and gait period is 100% and 98.63%, respectively, when the CNN model is employed alone. Gait period recognition rises to 99.9% accuracy when the PIG method and adaptive combination are also used. These results demonstrate the benefit of a system capable of predicting or anticipating the next event over time. Overall, the learning architecture offers an alternative approach for accurate activity recognition, which is essential for the development of wearable robots capable of reliably and safely assisting humans in activities of daily living. (c) 2021 Elsevier B.V. All rights reserved.
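The abstract's pipeline (CNN recognition, PIG next-period prediction, adaptive combination favoring the more reliable source) can be illustrated with a minimal sketch. All names, the gait-period labels, the transition table, and the max-probability confidence heuristic below are illustrative assumptions, not the authors' actual implementation.

```python
# Hedged sketch of the CNN + PIG adaptive combination described in the
# abstract. The transition table and the confidence heuristic are
# assumptions for illustration only.
import numpy as np

GAIT_PERIODS = ["initial_contact", "mid_stance", "terminal_stance", "swing"]

# PIG stand-in: from the current gait period, the next most probable
# period follows (assumed) cyclic transition probabilities.
TRANSITIONS = np.array([
    [0.05, 0.90, 0.03, 0.02],
    [0.02, 0.05, 0.90, 0.03],
    [0.03, 0.02, 0.05, 0.90],
    [0.90, 0.03, 0.02, 0.05],
])

def pig_predict(current_idx):
    """Return the predicted distribution over the next gait period."""
    return TRANSITIONS[current_idx]

def adaptive_combination(cnn_probs, pig_probs):
    """Rely on whichever source is more confident (peakier distribution)."""
    return cnn_probs if cnn_probs.max() >= pig_probs.max() else pig_probs

# One step: an ambiguous CNN output vs. a confident PIG prediction.
cnn_probs = np.array([0.30, 0.28, 0.22, 0.20])  # noisy CNN posterior
pig_probs = pig_predict(current_idx=0)          # PIG expects mid_stance next
combined = adaptive_combination(cnn_probs, pig_probs)
print(GAIT_PERIODS[int(combined.argmax())])     # → mid_stance
```

Here the PIG prediction wins because the CNN output is nearly uniform; when the CNN is confident, its posterior is used instead, which is the intuition behind "relies on whichever source proves more reliable."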
Pages: 1-10
Number of pages: 10