Trunk Motion System (TMS) Using Printed Body Worn Sensor (BWS) via Data Fusion Approach

Cited by: 32
Authors
Esfahani, Mohammad Iman Mokhlespour [1 ,2 ]
Zobeiri, Omid [3 ]
Moshiri, Behzad [4 ]
Narimani, Roya [2 ]
Mehravar, Mohammad [5 ]
Rashedi, Ehsan [6 ]
Parnianpour, Mohamad [2 ]
Affiliations
[1] Virginia Polytech Inst & State Univ, Dept Ind & Syst Engn, Blacksburg, VA 24061 USA
[2] Sharif Univ Technol, Sch Mech Engn, Lab Wearable Technol & Neuromusculoskeletal Res, Tehran 111559567, Iran
[3] McGill Univ, Dept Biomed Engn, Montreal, PQ H3A 2B4, Canada
[4] Univ Tehran, Sch Elect & Comp Engn, Ctr Excellence, Control & Intelligent Proc, Tehran 14395515, Iran
[5] Ahvaz Jundishapur Univ Med Sci, Musculoskeletal Rehabil Res Ctr, Ahvaz 6135733133, Iran
[6] Rochester Inst Technol, Dept Ind & Syst Engn, Rochester, NY 14623 USA
Funding
National Science Foundation (USA);
Keywords
wearable system; body worn sensor; trunk movement; sensor fusion; PHYSICAL-ACTIVITY; WEARABLE SENSORS; PLACEMENT; STRAIN; MOVEMENT; SHOULDER; TEXTILES;
DOI
10.3390/s17010112
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Discipline Classification Code
070302; 081704;
Abstract
Human movement analysis is an important part of biomechanics and rehabilitation, and many measurement systems have been introduced for this purpose. Among these, wearable devices have substantial biomedical applications, primarily because they can be used in both indoor and outdoor settings. In this study, a Trunk Motion System (TMS) using printed Body-Worn Sensors (BWS) was designed and developed. The TMS measures three-dimensional (3D) trunk motion and is lightweight, portable, and non-invasive. After suitable sensor locations were identified, twelve BWSs were printed on stretchable clothing to measure the 3D trunk movements. A neural network data fusion algorithm was used to integrate the BWS data, and its output, together with the actual 3D anatomical movements obtained from a Qualisys motion-capture system, was used to calibrate the TMS. Three healthy participants with different physical characteristics took part in the calibration tests, performing seven tasks (each repeated three times) that comprised five planar and two multiplanar movements. Results showed that the TMS was accurate to within 1.0, 0.8, 0.6, 0.8, 0.9, and 1.3 degrees for flexion/extension, left/right lateral bending, left/right axial rotation, and multi-planar motions, respectively. In addition, the accuracy of the TMS for the identified movements was within 2.7 degrees. The TMS, developed to monitor and measure trunk orientation, has diverse potential applications in clinical, biomechanical, and ergonomic studies for preventing musculoskeletal injuries and determining the impact of interventions.
Pages: 16
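The abstract describes a neural-network data fusion step that maps the twelve printed BWS channels to the 3D trunk angles, calibrated against Qualisys motion-capture ground truth, but it does not specify the network architecture here. The sketch below is only an illustration of such a fusion/calibration step, assuming a small multilayer-perceptron regressor (scikit-learn's MLPRegressor) and synthetic stand-in data; the layer size, variable names, and data generation are hypothetical and are not the authors' implementation.

```python
# Minimal sketch of a neural-network sensor-fusion step: twelve body-worn
# sensor (BWS) channels -> three trunk angles (flexion/extension, lateral
# bending, axial rotation). Synthetic data stands in for the BWS recordings
# and Qualisys ground truth; architecture and names are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_sensors, n_angles = 2000, 12, 3

# Synthetic calibration data: each strain-like channel responds roughly
# linearly to the trunk angles plus noise (placeholder for real recordings).
true_angles = rng.uniform(-40.0, 40.0, size=(n_samples, n_angles))  # degrees
mixing = rng.normal(size=(n_angles, n_sensors))
bws_signals = true_angles @ mixing + rng.normal(scale=0.5, size=(n_samples, n_sensors))

X_train, X_test, y_train, y_test = train_test_split(
    bws_signals, true_angles, test_size=0.25, random_state=0)

# Scale sensor channels so the MLP trains stably.
scaler = StandardScaler().fit(X_train)

# Small MLP regressor fusing all twelve channels into the three trunk angles.
fusion_net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
fusion_net.fit(scaler.transform(X_train), y_train)

# Report per-angle RMSE on held-out data, in degrees.
pred = fusion_net.predict(scaler.transform(X_test))
rmse = np.sqrt(np.mean((pred - y_test) ** 2, axis=0))
print("Per-angle RMSE (deg):", np.round(rmse, 2))
```

In the actual calibration, such a regressor would be trained on time-synchronized BWS and Qualisys recordings from the seven calibration tasks rather than on synthetic signals.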