Human activity recognition based on fusing inertial sensors with an optical receiver

Cited: 0
Authors
Salem, Ziad [1 ]
Lichtenegger, Felix [2 ]
Weiss, Andreas P. [1 ]
Leiner, Claude [2 ]
Sommer, Christian [2 ]
Wenzl, Franz P. [1 ]
Affiliations
[1] Joanneum Res Forsch mbH, Inst Surface Technol & Photon, Smart Connected Lighting, Ind Str 6, A-7423 Pinkafeld, Austria
[2] Joanneum Res Forschungsges mbH, Inst Surface Technol & Photon, Light & Opt Technol, Franz Pichler Str 30, A-8160 Weiz, Austria
Source
OPTICAL SENSING AND DETECTION VII | 2022 / Vol. 12139
Keywords
Human activity recognition; inertial measurement unit sensors; optical receiver; optical simulation; sensors data fusion; machine learning;
DOI
10.1117/12.2621187
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline Classification Codes
0808; 0809
Abstract
Research on Human Activity Recognition (HAR) systems has received considerable attention due to its importance in demanding and challenging fields of study such as health care, social science, robotics and artificial intelligence. One of the most prominent approaches is to use Inertial Measurement Unit (IMU) sensors to determine which activity a human is performing. If complex activities such as sit-down, stand-up, walk-up and walk-down need to be recognized, the user has to wear multiple sensors on his/her body for correct recognition. Such activity recognition is of particular interest if the subject's position is recognized as well. To recognize both the activity and the location properly, a sound technique for fusing the multiple sources of information is required. In this study, we propose a novel positioning and HAR system based on fusing data from a single IMU device with data from a simulated segmented optical receiver performing visible light positioning (VLP). We combine real-world data collected from the IMU device with optical simulation data generated from a simulated segmented optical receiver in order to distinguish between various complex activities, particularly walk, walk-up and walk-down, and additionally to determine the position where the activity is performed. The fusion mechanism not only improves the accuracy of the activity recognition in comparison with utilizing either IMU or optical data alone, but also enables the system to retrieve the user's position in the room. By applying different machine-learning (ML) algorithms for the assessment of the achievable results, we conduct a comprehensive analysis of which ML method is suitable for our envisioned low-complexity HAR and positioning system, which avoids the placement of multiple sensors on the user's body.
Our results show the influence of different segmentation strategies for the novel concept of a segmented optical receiver, in combination with an IMU sensor, on the accuracy of the activity and position recognition.
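The fusion described in the abstract can be illustrated with a minimal sketch: features derived from the IMU (e.g., motion statistics) are concatenated with features from the segmented optical receiver (e.g., per-segment received intensities), and a classifier is trained on the fused vector. All names, feature dimensions, class offsets and data below are illustrative assumptions, not from the paper; a simple nearest-centroid classifier stands in for the ML algorithms the authors compare.

```python
# Hedged sketch of feature-level sensor fusion for HAR, assuming synthetic data:
# 6 hypothetical IMU statistics are concatenated with 4 hypothetical
# optical-receiver segment intensities, then classified by nearest centroid.
import random

random.seed(0)
ACTIVITIES = ["walk", "walk-up", "walk-down"]

def make_sample(label):
    """Synthetic fused feature vector for one activity window.
    Class-dependent mean offsets stand in for real sensor signatures."""
    offset = ACTIVITIES.index(label)
    imu = [random.gauss(offset, 0.3) for _ in range(6)]       # IMU features
    optical = [random.gauss(offset * 0.5, 0.3) for _ in range(4)]  # optical features
    return imu + optical  # feature-level fusion: simple concatenation

train = [(make_sample(a), a) for a in ACTIVITIES for _ in range(30)]

# Compute one centroid per activity from the training vectors.
centroids = {}
for a in ACTIVITIES:
    vecs = [x for x, lbl in train if lbl == a]
    centroids[a] = [sum(col) / len(vecs) for col in zip(*vecs)]

def classify(x):
    # Assign the activity whose centroid is closest in squared distance.
    return min(centroids,
               key=lambda a: sum((xi - ci) ** 2
                                 for xi, ci in zip(x, centroids[a])))

test = [(make_sample(a), a) for a in ACTIVITIES for _ in range(10)]
accuracy = sum(classify(x) == lbl for x, lbl in test) / len(test)
print(accuracy)
```

Because the fused vector carries class information in both the IMU and the optical components, the classifier separates the three walking activities more reliably than either feature group alone would, which mirrors the accuracy gain the abstract reports for fusion.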
Pages: 19
Related Papers
50 records in total
  • [21] Out-of-Distribution Detection of Human Activity Recognition with Smartwatch Inertial Sensors
    Boyer, Philip
    Burns, David
    Whyne, Cari
    SENSORS, 2021, 21 (05) : 1 - 23
  • [22] A hierarchical method for human concurrent activity recognition using miniature inertial sensors
    Chen, Ye
    Wang, Zhelong
    SENSOR REVIEW, 2017, 37 (01) : 101 - 109
  • [23] Human Activity Recognition Using Inertial, Physiological and Environmental Sensors: A Comprehensive Survey
    Demrozi, Florenc
    Pravadelli, Graziano
    Bihorac, Azra
    Rashidi, Parisa
    IEEE ACCESS, 2020, 8 : 210816 - 210836
  • [24] An efficient deep learning-based approach for human activity recognition using smartphone inertial sensors
    Djemili R.
    Zamouche M.
    INTERNATIONAL JOURNAL OF COMPUTERS AND APPLICATIONS, 2023, 45 (04) : 323 - 336
  • [25] Relative Pose Estimation of Underwater Robot by Fusing Inertial sensors and Optical image
    Choi, Jinwoo
    Kim, Seokyong
    Lee, Yeongjun
    Kim, Tae-Jin
    Choi, Hyun-Taek
    2014 11TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS AND AMBIENT INTELLIGENCE (URAI), 2014, : 204 - 208
  • [26] A Tutorial on Human Activity Recognition Using Body-Worn Inertial Sensors
    Bulling, Andreas
    Blanke, Ulf
    Schiele, Bernt
    ACM COMPUTING SURVEYS, 2014, 46 (03)
  • [27] Human Activity Classification with Inertial Sensors
    Silva, Joana
    Monteiro, Miguel
    Sousa, Filipe
    PHEALTH 2014, 2014, 200 : 101 - 104
  • [28] A Survey of Activity Recognition Process Using Inertial sensors and Smartphone Sensors
    Shweta
    Khandnor, Padmavati
    Kumar, Neelesh
    2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTING, COMMUNICATION AND AUTOMATION (ICCCA), 2017, : 607 - 612
  • [29] Activity Recognition using Head Worn Inertial Sensors
    Wolff, Johann P.
    Gruetzmacher, Florian
    Wellnitz, Arne
    Haubelt, Christian
    5TH INTERNATIONAL WORKSHOP ON SENSOR-BASED ACTIVITY RECOGNITION AND INTERACTION (IWOAR 2018), 2018,
  • [30] Recognition of human locomotion on various transportations fusing smartphone sensors
    Antar, Anindya Das
    Ahmed, Masud
    Ahad, Md Atiqur Rahman
    PATTERN RECOGNITION LETTERS, 2021, 148 : 146 - 153