Human activity recognition based on fusing inertial sensors with an optical receiver

Cited by: 0
Authors
Salem, Ziad [1 ]
Lichtenegger, Felix [2 ]
Weiss, Andreas P. [1 ]
Leiner, Claude [2 ]
Sommer, Christian [2 ]
Wenzl, Franz P. [1 ]
Affiliations
[1] Joanneum Res Forsch mbH, Inst Surface Technol & Photon, Smart Connected Lighting, Ind Str 6, A-7423 Pinkafeld, Austria
[2] Joanneum Res Forschungsges mbH, Inst Surface Technol & Photon, Light & Opt Technol, Franz Pichler Str 30, A-8160 Weiz, Austria
Source
OPTICAL SENSING AND DETECTION VII | 2022, Vol. 12139
Keywords
Human activity recognition; inertial measurement unit sensors; optical receiver; optical simulation; sensors data fusion; machine learning;
DOI
10.1117/12.2621187
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject Classification
0808; 0809
Abstract
Research on Human Activity Recognition (HAR) systems has received considerable attention due to its importance in demanding and challenging fields such as health care, social science, robotics and artificial intelligence. One of the most prominent approaches is to use Inertial Measurement Unit (IMU) sensors to determine which activity a human is performing. If complex activities such as sit-down, stand-up, walk-up and walk-down need to be recognized, the user has to wear multiple sensors on the body to achieve correct recognition. Such activity recognition is of even greater interest if the user's position is also determined. To recognize both the activity and the location properly, a suitable technique for fusing the multiple sources of information is required. In this study, we propose a novel positioning and HAR system based on fusing data from a single IMU device with data from a simulated segmented optical receiver that performs visible light positioning (VLP). We combine real-world data collected from the IMU device with optical simulation data generated from a simulated segmented optical receiver in order to distinguish between various complex activities, particularly walk, walk-up and walk-down, and to determine the position where the activity is performed. The fusion mechanism not only improves the accuracy of the activity recognition compared to utilizing either IMU or optical data alone, but also enables the system to retrieve the user's position in the room. By applying different machine learning (ML) algorithms and assessing the achievable results, we conduct a comprehensive analysis of which ML method is suitable for our envisioned low-complexity HAR and positioning system, which avoids the placement of multiple sensors on the user's body.
Our results show the influence of different segmentation strategies for the novel concept of a segmented optical receiver, in combination with an IMU sensor, on the accuracy of the activity and position recognition.
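The abstract describes a feature-level fusion of IMU data with per-segment intensities from a segmented optical receiver, followed by ML classification. The sketch below illustrates one plausible shape of such a pipeline; it is not the authors' implementation. All names, feature choices, window sizes, the 4-segment receiver, and the nearest-centroid stand-in classifier are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def imu_features(window):
    """Mean and standard deviation per IMU axis over a time window."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def fuse(imu_window, optical_segments):
    """Concatenate IMU features with normalized optical segment intensities."""
    opt = np.asarray(optical_segments, dtype=float)
    opt = opt / (opt.sum() + 1e-9)  # normalize received power across segments
    return np.concatenate([imu_features(imu_window), opt])

# Toy data: 3 activity classes x 20 windows of 50 samples x 6 IMU axes,
# each paired with a hypothetical 4-segment optical receiver reading.
X, y = [], []
for label in range(3):
    for _ in range(20):
        window = rng.normal(loc=label, scale=0.3, size=(50, 6))
        segments = rng.normal(loc=np.eye(4)[label] + 0.1, scale=0.02)
        X.append(fuse(window, segments))
        y.append(label)
X, y = np.array(X), np.array(y)

# Minimal nearest-centroid classifier standing in for the ML algorithms
# compared in the paper (a stand-in choice, not the authors' method).
centroids = np.array([X[y == c].mean(axis=0) for c in range(3)])

def predict(sample):
    return int(np.argmin(np.linalg.norm(centroids - sample, axis=1)))

accuracy = np.mean([predict(x) == t for x, t in zip(X, y)])
```

Concatenating the two modalities into one feature vector (feature-level fusion) lets a single classifier exploit both motion dynamics and coarse location cues; the normalized optical segment vector additionally encodes which receiver segment sees the most light, which is the information VLP uses for positioning.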
Pages: 19