Enhancing fall risk assessment: instrumenting vision with deep learning during walks

Cited by: 1
Authors
Moore, Jason [1 ]
Catena, Robert [2 ]
Fournier, Lisa [2 ]
Jamali, Pegah [2 ]
Mcmeekin, Peter [3 ]
Stuart, Samuel [4 ]
Walker, Richard [5 ]
Salisbury, Thomas [6 ]
Godfrey, Alan [1 ]
Affiliations
[1] Northumbria Univ, Dept Comp & Informat Sci, Newcastle Upon Tyne NE1 8ST, England
[2] Washington State Univ, Dept Kinesiol & Educ Psychol, Pullman, WA USA
[3] Northumbria Univ, Dept Nursing Midwifery & Hlth, Newcastle Upon Tyne, England
[4] Northumbria Univ, Dept Sport Exercise & Rehabil, Newcastle Upon Tyne, England
[5] Northumbria Healthcare NHS Fdn Trust, North Tyneside, England
[6] South Tyneside & Sunderland NHS Fdn Trust, Sunderland, England
Keywords
Fall risk; Gait analysis; Object detection; Deep learning; Visual attention; GAIT; INDIVIDUALS; STABILITY; ALGORITHM; ATTENTION; POSTURE; BALANCE; STROKE; ISSUE
DOI
10.1186/s12984-024-01400-2
Chinese Library Classification
R318 [Biomedical Engineering]
Discipline Code
0831
Abstract
Background: Falls are common across a range of clinical cohorts, where routine risk assessment often comprises subjective visual observation only. Typically, observational assessment involves evaluating an individual's gait during scripted walking protocols in a lab to identify deficits that may increase fall risk, but subtle deficits may not be readily observable. Objective approaches (e.g., inertial measurement units, IMUs) are therefore useful for quantifying high-resolution gait characteristics, capturing subtle deficits and enabling better-informed fall risk assessment. However, IMU-based gait instrumentation alone is limited, as it does not consider participant behaviour or details of the environment (e.g., obstacles). Video-based eye-tracking glasses may provide additional insight into fall risk, clarifying how people traverse environments based on head and eye movements. Recording head and eye movements can reveal how the allocation of visual attention to environmental stimuli influences successful navigation around obstacles. Yet manual review of video data to evaluate head and eye movements is time-consuming and subjective, and no automated approach currently exists. This paper proposes a deep learning-based object detection algorithm (VARFA) to instrument vision and video data during walks, complementing instrumented gait.

Method: The approach automatically labels video data captured in a gait lab to assess visual attention and details of the environment. The proposed algorithm uses a YOLOv8 model trained on a novel lab-based dataset.

Results: VARFA achieved excellent evaluation metrics (0.93 mAP50), identifying and localizing static objects (e.g., obstacles in the walking path) with an average accuracy of 93%. Similarly, a U-Net-based track/path segmentation model achieved good metrics (IoU 0.82), indicating that the predicted tracks (i.e., walking paths) align closely with the actual track, with an overlap of 82%. Notably, both models achieved these metrics while processing at real-time speeds, demonstrating efficiency and effectiveness for pragmatic applications.

Conclusion: The instrumented approach improves the efficiency and accuracy of fall risk assessment by evaluating the allocation of visual attention (i.e., when and where a person is attending) during navigation, broadening the scope of instrumentation in this area. Using VARFA to instrument vision could better inform fall risk assessment by providing behaviour and context data to complement instrumented (e.g., IMU) data during gait tasks. That may have notable (e.g., personalized) rehabilitation implications across a wide range of clinical cohorts where poor gait and increased fall risk are common.
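For orientation, a minimal sketch of how such a pipeline might be assembled is given below. It is not the authors' VARFA implementation: the weights file ("varfa_lab.pt"), video path ("walk.mp4"), and mask arrays are hypothetical placeholders, and it pairs the public Ultralytics YOLOv8 inference API for per-frame object detection with a plain NumPy intersection-over-union (IoU) for scoring a binary walking-path segmentation.

# Minimal sketch (not the paper's VARFA code): run a YOLOv8 detector over
# gait-lab video frames and score a binary walking-path segmentation against
# ground truth with IoU. Weights file, video path, and masks are placeholders.
import numpy as np
from ultralytics import YOLO  # public YOLOv8 API (pip install ultralytics)

def detect_objects(weights_path, video_path):
    """Yield per-frame detections as (class_id, confidence, [x1, y1, x2, y2])."""
    model = YOLO(weights_path)                     # e.g. lab-trained weights
    for result in model(video_path, stream=True):  # stream=True -> frame by frame
        boxes = result.boxes
        yield [(int(c), float(p), xyxy.tolist())
               for c, p, xyxy in zip(boxes.cls, boxes.conf, boxes.xyxy)]

def mask_iou(pred_mask, true_mask):
    """IoU between two binary masks (e.g. predicted vs. actual walking path)."""
    pred, true = pred_mask.astype(bool), true_mask.astype(bool)
    union = np.logical_or(pred, true).sum()
    return 1.0 if union == 0 else np.logical_and(pred, true).sum() / union

if __name__ == "__main__":
    for frame_dets in detect_objects("varfa_lab.pt", "walk.mp4"):  # placeholders
        print(frame_dets)

The mask_iou function corresponds to the per-frame overlap metric behind the reported segmentation IoU of 0.82, while the detection loop mirrors the kind of real-time, frame-wise inference the abstract describes for the 0.93 mAP50 object detector.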
Pages: 14