Indoor Pedestrian Positioning Method Based on Ultra-Wideband with a Graph Convolutional Network and Visual Fusion

Times Cited: 0
Authors
Mu, Huizhen [1 ,2 ]
Yu, Chao [1 ,2 ,3 ,4 ]
Jiang, Shuna [1 ,2 ]
Luo, Yujing [1 ,2 ]
Zhao, Kun [1 ,2 ,3 ]
Chen, Wen [1 ,2 ,3 ,4 ]
Affiliations
[1] East China Normal Univ, Engn Ctr SHMEC Space Informat, Shanghai 200241, Peoples R China
[2] East China Normal Univ, GNSS, Shanghai 200241, Peoples R China
[3] East China Normal Univ, Shanghai Key Lab Multidimens Informat Proc, Shanghai 200241, Peoples R China
[4] East China Normal Univ, Key Lab Geog Informat Sci, Minist Educ, Shanghai 200241, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
indoor fusion positioning; UWB; vision sensor; GCN; particle filter;
DOI
10.3390/s24206732
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Discipline Classification Codes
070302 ; 081704 ;
Abstract
To address the low indoor positioning accuracy caused by factors such as signal interference and visual distortion, this paper proposes a method that integrates ultra-wideband (UWB) and visual positioning. In the UWB positioning module, the strong feature-extraction ability of a graph convolutional network (GCN) is used to aggregate the features of adjacent positioning points and improve positioning accuracy. In the visual positioning module, residuals learned by a bidirectional gated recurrent unit (Bi-GRU) network are used to compensate the solution of the mathematical visual positioning model, improving the continuity of the positioning results. Finally, the two sets of coordinates are fused with a particle filter (PF) to obtain the final position estimate and further improve accuracy. Experimental results show that the proposed GCN-based UWB method achieves a standalone positioning accuracy better than 0.72 m, a 55% improvement over the Chan-Taylor algorithm. The proposed visual positioning method based on Bi-GRU residual fitting achieves an accuracy of 0.42 m, 71% better than the Zhang Zhengyou visual positioning algorithm. In the fusion experiment, 80% of the position estimates fall within 0.24 m and the maximum error is 0.66 m; compared with UWB-only and vision-only positioning, accuracy improves by 56% and 52%, respectively, effectively enhancing indoor pedestrian positioning accuracy.
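The PF-based fusion step described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the particle count, the Gaussian measurement models, and the noise standard deviations (chosen loosely to echo the reported ~0.72 m UWB and ~0.42 m visual accuracies) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_fuse(uwb_xy, vis_xy, n=500, sigma_uwb=0.7, sigma_vis=0.4, spread=1.0):
    """One particle-filter update fusing a UWB fix and a visual fix.

    Particles are proposed around the UWB fix, weighted by the (assumed
    Gaussian) likelihood of both measurements, and the weighted mean is
    returned as the fused position estimate.
    """
    particles = uwb_xy + rng.normal(0.0, spread, size=(n, 2))
    # Joint likelihood of each particle under both position measurements
    w = (np.exp(-np.sum((particles - uwb_xy) ** 2, axis=1) / (2 * sigma_uwb ** 2))
         * np.exp(-np.sum((particles - vis_xy) ** 2, axis=1) / (2 * sigma_vis ** 2)))
    w /= w.sum()
    return w @ particles  # weighted mean = fused position

fused = pf_fuse(np.array([2.0, 3.0]), np.array([2.3, 3.1]))
```

Because the visual measurement is modeled with the smaller standard deviation, the fused estimate lands between the two fixes but closer to the visual one, mirroring the paper's finding that fusion outperforms either sensor alone.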
Pages: 21