A CNN-Based Method for Intent Recognition Using Inertial Measurement Units and Intelligent Lower Limb Prosthesis

Cited by: 113
Authors
Su, Ben-Yue [1 ]
Wang, Jie [1 ]
Liu, Shuang-Qing [2 ]
Sheng, Min [2 ]
Jiang, Jing [3 ]
Xiang, Kui [4 ]
Affiliations
[1] Anqing Normal Univ, Sch Comp & Informat, Anqing 246133, Peoples R China
[2] Anqing Normal Univ, Sch Math & Computat Sci, Anqing 246133, Peoples R China
[3] Jiangsu Univ, Sch Comp Sci & Telecommun Engn, Zhenjiang 212013, Jiangsu, Peoples R China
[4] Wuhan Univ Technol, Sch Automat, Wuhan 430070, Hubei, Peoples R China
Keywords
Intent recognition; lower limb prosthesis; inertial measurement unit (IMU); convolutional neural networks (CNN); swing phase; AMPUTEES; WALKING; CLASSIFICATION;
DOI
10.1109/TNSRE.2019.2909585
CLC Classification
R318 [Biomedical Engineering]
Discipline Code
0831
Abstract
A powered intelligent lower limb prosthesis can actuate the knee and ankle joints, allowing transfemoral amputees to perform seamless transitions between locomotion states with the help of an intent recognition system. However, prior intent recognition studies often installed multiple sensors on the prosthesis and employed machine learning techniques to analyze time-series data with hand-crafted features. We alternatively propose a novel method for training an intent recognition system that provides natural transitions between level walking, stair ascent/descent, and ramp ascent/descent. Since the transition between two neighboring states is driven by motion intent, we aim to explore the mapping between the motion state of the healthy leg and an amputee's motion intent before the upcoming transition of the prosthesis. We place inertial measurement units (IMUs) on the healthy leg of lower limb amputees to monitor its locomotion state. We analyze IMU data within the early swing phase of the healthy leg and feed the data into a convolutional neural network (CNN) to learn the feature mapping without expert participation. The proposed method can predict the motion intent of both unilateral amputees and the able-bodied, and helps to adaptively calibrate the control strategy for actuating the powered intelligent prosthesis in advance. The experimental results show that the recognition accuracy reaches a high level (94.15% for the able-bodied, 89.23% for amputees) on 13 classes of motion intent, comprising five steady states on different terrains as well as eight transitional states among the steady states.
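The data flow the abstract describes (an early-swing-phase IMU window fed through a CNN that outputs one of 13 intent classes) can be sketched as follows. This is a minimal NumPy illustration, not the authors' architecture: the channel count, window length, kernel size, and filter count are assumptions, and the weights are random and untrained, so only the shapes and the conv → ReLU → pool → softmax pipeline are meaningful.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not given in the abstract):
# 6 IMU channels (3-axis accelerometer + 3-axis gyroscope),
# 50 samples covering the early swing phase, 13 intent classes.
N_CHANNELS, N_SAMPLES, N_CLASSES = 6, 50, 13

def conv1d(x, w, b):
    """Valid 1-D convolution: x is (C_in, T), w is (C_out, C_in, K), b is (C_out,)."""
    c_out, _, k = w.shape
    t_out = x.shape[1] - k + 1
    y = np.empty((c_out, t_out))
    for o in range(c_out):
        for t in range(t_out):
            y[o, t] = np.sum(w[o] * x[:, t:t + k]) + b[o]
    return y

def softmax(z):
    e = np.exp(z - z.max())  # shift for numerical stability
    return e / e.sum()

def cnn_forward(x, params):
    """One conv layer -> ReLU -> global average pooling -> dense -> softmax."""
    h = np.maximum(conv1d(x, params["w1"], params["b1"]), 0.0)
    pooled = h.mean(axis=1)                     # (16,) per-filter summary
    logits = params["w2"] @ pooled + params["b2"]
    return softmax(logits)                      # class probabilities

# Random (untrained) weights, just to exercise the pipeline.
params = {
    "w1": rng.normal(0.0, 0.1, (16, N_CHANNELS, 5)),
    "b1": np.zeros(16),
    "w2": rng.normal(0.0, 0.1, (N_CLASSES, 16)),
    "b2": np.zeros(N_CLASSES),
}

imu_window = rng.normal(size=(N_CHANNELS, N_SAMPLES))  # one early-swing window
probs = cnn_forward(imu_window, params)
print(probs.shape)  # (13,) — one probability per intent class
```

In a trained system, the argmax over `probs` would select among the five steady states and eight transitional states before the prosthesis executes the transition.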
Pages: 1032-1042
Page count: 11
Related Papers (50 total)
  • [1] CNN-Based Intention Recognition Using Body-Worn Inertial Measurement Units
    Bajraktari, Flake
    Rosskopf, Nikolas
    Pott, Peter P.
    2023 IEEE 36TH INTERNATIONAL SYMPOSIUM ON COMPUTER-BASED MEDICAL SYSTEMS, CBMS, 2023, : 760 - 765
  • [2] An Improved Motion Intent Recognition Method for Intelligent Lower Limb Prosthesis Driven by Inertial Motion Capture Data
    Su B.-Y.
    Wang J.
    Liu S.-Q.
    Sheng M.
    Xiang K.
    Zidonghua Xuebao/Acta Automatica Sinica, 2020, 46 (07): : 1517 - 1530
  • [3] Motion intent recognition of intelligent lower limb prosthesis based on GMM-HMM
    Sheng M.
    Liu S.
    Wang J.
    Su B.
    Yi Qi Yi Biao Xue Bao/Chinese Journal of Scientific Instrument, 2019, 40 (05): : 169 - 178
  • [4] Lower Limb Gait Activity Recognition Using Inertial Measurement Units for rehabilitation robotics
    Hamdi, Mohammed M.
    Awad, Mohammed I.
    Abdelhameed, Magdy M.
    Tolbah, Farid A.
    PROCEEDINGS OF THE 17TH INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS (ICAR), 2015, : 316 - 322
  • [5] Intent Recognition in a Powered Lower Limb Prosthesis Using Time History Information
    Young, Aaron J.
    Simon, Ann M.
    Fey, Nicholas P.
    Hargrove, Levi J.
    ANNALS OF BIOMEDICAL ENGINEERING, 2014, 42 (03) : 631 - 641
  • [7] Real-time motion intent recognition of intelligent lower limb prosthesis based on improved template matching technique
    Sheng M.
    Liu S.-Q.
    Wang J.
    Su B.-Y.
    Kongzhi yu Juece/Control and Decision, 2020, 35 (09): : 2153 - 2161
  • [8] A Method for Improving CNN-Based Image Recognition Using DCGAN
    Fang, Wei
    Zhang, Feihong
    Sheng, Victor S.
    Ding, Yewen
    CMC-COMPUTERS MATERIALS & CONTINUA, 2018, 57 (01): : 167 - 178
  • [9] Intent recognition of power lower-limb prosthesis based on improved convolutional neural network
    Su B.-Y.
    Ni Y.
    Sheng M.
    Zhao L.-L.
    Kongzhi yu Juece/Control and Decision, 2021, 36 (12): : 3031 - 3038
  • [10] Motion Intent Recognition in Intelligent Lower Limb Prosthesis Using One-Dimensional Dual-Tree Complex Wavelet Transforms
    Sheng, Min
    Wang, Wan-Jun
    Tong, Ting-Ting
    Yang, Yuan-Yuan
    Chen, Hui-Lin
    Su, Ben-Yue
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2021, 2021