Consistency Verification of Marker-Less Gait Assessment System for Stair Walking

Citations: 0
Authors
Ogawa, Ami [1 ]
Yorozu, Ayanori [1 ]
Mita, Akira [1 ]
Takahashi, Masaki [1 ]
Georgoulas, Christos [2 ]
Bock, Thomas [2 ]
Affiliations
[1] Keio Univ, Sch Sci Open & Environm Syst, Tokyo, Japan
[2] Tech Univ Munich, Chair Bldg Realizat & Robot, Munich, Germany
Keywords
Stair walking; Marker-less gait measurement; Depth data; Kinect v2; VICON; AGE-RELATED DIFFERENCES; OLDER-ADULTS; DESCENT; KNEE; ASCENT; FORCES; RISK;
DOI
10.1007/978-3-319-31744-1_57
Chinese Library Classification
Q5 [Biochemistry];
Subject Classification Codes
071010; 081704;
Abstract
The number of elderly people is increasing drastically. To support them, gait information has attracted attention because it is associated with both fall risk and dementia. Among everyday activities, stair walking demands a relatively high level of ability, as it requires both balance and load-bearing. Conventionally, 3D motion capture systems have been used to acquire stair-walking parameters, but acquiring such parameters in daily life is difficult because the equipment requires complicated preparation and body-worn markers. In this study, we propose a system that acquires daily stair-walking parameters using only depth data obtained by a Kinect v2, without constraining the subject with markers. We verified the accuracy of the proposed system against a 3D motion capture system.
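The record contains no code, but the general approach the abstract describes, deriving gait parameters from per-frame 3D joint positions such as those produced by the Kinect v2 skeleton tracker, can be illustrated with a short sketch. The step-boundary heuristic below (a local maximum of horizontal ankle separation) and the synthetic data are assumptions for illustration only, not the authors' actual parameter-extraction method.

```python
import numpy as np

def step_lengths(left_ankle, right_ankle):
    """Estimate step lengths (m) from per-frame 3D ankle positions.

    A step boundary is approximated as a local maximum of the
    horizontal left-right ankle separation (an illustrative
    heuristic; the paper's parameter extraction may differ).
    """
    la = np.asarray(left_ankle, dtype=float)
    ra = np.asarray(right_ankle, dtype=float)
    # Horizontal (x, z) separation between the two ankles per frame.
    sep = np.linalg.norm((la - ra)[:, [0, 2]], axis=1)
    # Interior local maxima of the separation signal.
    peaks = [i for i in range(1, len(sep) - 1)
             if sep[i] >= sep[i - 1] and sep[i] > sep[i + 1]]
    return [sep[i] for i in peaks]

# Synthetic walking-like data: ankles oscillating in anti-phase
# along x while drifting slowly forward along z.
t = np.linspace(0, 2 * np.pi, 60)
left = np.stack([0.3 * np.sin(t), np.zeros_like(t), 0.02 * t], axis=1)
right = np.stack([-0.3 * np.sin(t), np.zeros_like(t), 0.02 * t], axis=1)
print(step_lengths(left, right))  # two steps of roughly 0.6 m each
```

In a real deployment, the ankle trajectories would come from the sensor's per-frame skeleton joints rather than synthetic arrays, and the height-change and balance parameters relevant to stair walking would require the vertical (y) component as well.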
Pages: 653-663
Number of pages: 11