Sensor-Data Fusion for Multi-Person Indoor Location Estimation

Cited by: 13
Authors
Mohebbi, Parisa [1 ]
Stroulia, Eleni [1 ]
Nikolaidis, Ioanis [1 ]
Affiliations
[1] Univ Alberta, Dept Comp Sci, Edmonton, AB T6G 2R3, Canada
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
indoor localization; activities of daily living; activity recognition; sensor fusion; passive infrared (PIR) sensors; Bluetooth Low-Energy (BLE); BLE beacons; Estimote; anonymous sensing; eponymous sensing; TRACKING;
DOI
10.3390/s17102377
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Subject Classification Codes
070302; 081704;
Abstract
We consider the problem of estimating the location of people as they move and work in indoor environments. More specifically, we focus on the scenario where one of the persons of interest is unable or unwilling to carry a smartphone, or any other wearable device, which frequently arises in caregiver/cared-for situations. We consider the case of indoor spaces populated with anonymous binary sensors (Passive Infrared motion sensors) and eponymous wearable sensors (smartphones interacting with Estimote beacons), and we propose a solution to the resulting sensor-fusion problem. Using a data set with sensor readings collected from one-person and two-person sessions engaged in a variety of activities of daily living, we investigate the relative merits of relying solely on anonymous sensors, solely on eponymous sensors, or on their combination. We examine how the lack of synchronization across different sensing sources impacts the quality of location estimates, and discuss how it could be mitigated without resorting to device-level mechanisms. Finally, we examine the trade-off between the sensors' coverage of the monitored space and the quality of the location estimates.
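The fusion idea summarized in the abstract (anonymous PIR motion events combined with eponymous BLE-beacon readings from a single carried phone, with readings grouped into coarse time windows to tolerate unsynchronized clocks) can be illustrated with a minimal sketch. The code below is not the paper's estimation algorithm; it assumes hypothetical room-level mappings (beacon_room, pir_room), an arbitrary window length WINDOW_S, and a deliberately naive "strongest beacon wins" plus "unexplained motion belongs to the device-free occupant" rule, purely to show how the two sensing modalities could be combined.

```python
"""Illustrative sketch (not the paper's method): fuse anonymous PIR motion
events with eponymous BLE-beacon RSSI to assign two occupants (one carrying
a phone, one device-free) to rooms. All names and parameters here are
hypothetical, introduced only for this example."""
from collections import defaultdict

WINDOW_S = 5.0  # fusion window (seconds); coarser windows mask clock skew


def bucket(ts):
    """Map a timestamp to its fusion window index."""
    return int(ts // WINDOW_S)


def fuse(ble_readings, pir_events, beacon_room, pir_room):
    """
    ble_readings: list of (timestamp, beacon_id, rssi_dbm) seen by the phone
    pir_events:   list of (timestamp, pir_id) binary motion triggers
    beacon_room:  beacon_id -> room name
    pir_room:     pir_id -> room name
    Returns {window: {"phone_carrier": room, "device_free": room_or_None}}.
    """
    # Eponymous estimate: per window, the room of the strongest beacon.
    strongest = {}  # window -> (rssi, room)
    for ts, beacon, rssi in ble_readings:
        w = bucket(ts)
        if w not in strongest or rssi > strongest[w][0]:
            strongest[w] = (rssi, beacon_room[beacon])

    # Anonymous estimate: per window, the set of rooms with motion.
    motion = defaultdict(set)  # window -> {rooms}
    for ts, pir in pir_events:
        motion[bucket(ts)].add(pir_room[pir])

    estimates = {}
    for w in sorted(set(strongest) | set(motion)):
        carrier_room = strongest.get(w, (None, None))[1]
        # Motion not explained by the phone carrier is attributed to the
        # device-free occupant (a deliberately naive disambiguation rule).
        unexplained = motion[w] - ({carrier_room} if carrier_room else set())
        estimates[w] = {
            "phone_carrier": carrier_room,
            "device_free": next(iter(unexplained), None),
        }
    return estimates


if __name__ == "__main__":
    beacon_room = {"b1": "kitchen", "b2": "living_room"}
    pir_room = {"p1": "kitchen", "p2": "bedroom"}
    ble = [(0.5, "b1", -62), (1.2, "b2", -78), (6.1, "b2", -60)]
    pir = [(0.8, "p1"), (1.0, "p2"), (6.4, "p2")]
    for w, est in fuse(ble, pir, beacon_room, pir_room).items():
        print(w, est)
```

Windowed bucketing stands in here for the synchronization mitigation the abstract alludes to; the paper's actual handling of unsynchronized sources and its coverage/quality trade-off analysis are described in the full text.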
Pages: 21
Related Papers
50 records in total
  • [1] Multi-person 3D pose estimation from unlabelled data
    Rodriguez-Criado, Daniel
    Bachiller-Burgos, Pilar
    Vogiatzis, George
    Manso, Luis J.
    MACHINE VISION AND APPLICATIONS, 2024, 35 (03)
  • [2] Multi-person Location and Tracking Method Based on BP Neural Network
    Pan Wei
    Liu Zhizhan
    Zou Yi
2008 IEEE CONFERENCE ON CYBERNETICS AND INTELLIGENT SYSTEMS, VOLS 1 AND 2, 2008: 932+
  • [3] An RGB/Infra-Red camera fusion approach for Multi-Person Pose Estimation in low light environments
    Crescitelli, Viviana
    Kosuge, Atsutake
    Oshima, Takashi
2020 IEEE SENSORS APPLICATIONS SYMPOSIUM (SAS 2020), 2020
  • [4] Multi-Modal Sensor Fusion for Indoor Mobile Robot Pose Estimation
    Dobrev, Yassen
    Flores, Sergio
    Vossiek, Martin
PROCEEDINGS OF THE 2016 IEEE/ION POSITION, LOCATION AND NAVIGATION SYMPOSIUM (PLANS), 2016: 553-556
  • [5] Indoor Localization with multi sensor data fusion in ad hoc mobile scenarios
    Minutolo, Riccardo
    Annoni, Luca Alfredo
2014 IEEE INTERNATIONAL CONFERENCE ON ULTRA-WIDEBAND (ICUWB), 2014: 403-408
  • [6] Design of a Hybrid Indoor Location System Based on Multi-Sensor Fusion for Robot Navigation
    Shi, Yongliang
    Zhang, Weimin
    Yao, Zhuo
    Li, Mingzhu
    Liang, Zhenshuo
    Cao, Zhongzhong
    Zhang, Hua
    Huang, Qiang
    SENSORS, 2018, 18 (10)
  • [7] Shape-aware Multi-Person Pose Estimation from Multi-View Images
    Dong, Zijian
    Song, Jie
    Chen, Xu
    Guo, Chen
    Hilliges, Otmar
2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021: 11138-11148
  • [8] Train distance and speed estimation using multi sensor data fusion
    Muniandi, Ganesan
    Deenadayalan, Ezhilarasi
IET RADAR SONAR AND NAVIGATION, 2019, 13(04): 664-671
  • [9] Resource-aware strategies for real-time multi-person pose estimation
    Esmail, Mohammed A.
    Wang, Jinlei
    Wang, Yihao
    Sun, Li
    Zhu, Guoliang
    Zhang, Guohe
    IMAGE AND VISION COMPUTING, 2025, 155
  • [10] RAV4D: A Radar-Audio-Visual Dataset for Indoor Multi-Person Tracking
    Zhou, Yi
    Song, Ningfei
    Ma, Jieming
    Man, Ka Lok
    Lopez-Benitez, Miguel
    Yu, Limin
    Yue, Yutao
2024 IEEE RADAR CONFERENCE, RADARCONF 2024, 2024