USING OPTICAL FLOW FOR FILLING THE GAPS IN VISUAL-INERTIAL TRACKING

Cited by: 0
Authors
Bleser, Gabriele [1]
Hendeby, Gustaf [1]
Affiliations
[1] German Res Ctr Artificial Intelligence, Dept Augmented Vis, D-67663 Kaiserslautern, Germany
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TM (Electrical Engineering); TN (Electronics and Communication Technology)
Discipline codes
0808; 0809
Abstract
Egomotion tracking has been a frequently addressed problem over the last decades. Hybrid approaches have evidently shown the potential to provide accurate, efficient and robust results. Simultaneous localisation and mapping (SLAM), in contrast to model-based approaches, enables tracking in unknown environments, but it suffers from high computational complexity. Moreover, in many applications the map itself is not needed and the target environment is partially known, e.g. through a few 3D anchor points. In this paper, rather than using SLAM, optical flow measurements are introduced into a model-based system. With these measurements, a modified visual-inertial tracking method is derived which, in Monte Carlo simulations, reduces the need for 3D points and thus allows tracking during extended gaps in 3D point registrations.
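The abstract describes fusing optical-flow measurements into a model-based, filter-style visual-inertial tracker so that velocity information from flow can bridge gaps in 3D anchor-point registrations. The sketch below is not the authors' formulation; it is a minimal illustration of the idea under strong simplifying assumptions: a 2D constant-velocity camera state, an assumed focal length F_PX and coarsely known scene depth DEPTH, and a fronto-parallel flow model in which image flow is approximately (focal length / depth) times translational velocity. Anchor-point fixes update position when they are visible; during gaps, optical flow constrains only the velocity states.

```python
# Minimal sketch (illustrative, not the paper's method): a linear Kalman filter
# that fuses inertial predictions with two alternative measurement types --
# registered 3D anchor points (absolute position) when available, and
# optical-flow-derived velocity constraints during registration gaps.
import numpy as np

DT = 0.01       # IMU sample period [s] (assumed)
F_PX = 500.0    # focal length [pixels] (assumed)
DEPTH = 3.0     # coarsely known scene depth [m] (assumed)

# State: [px, py, vx, vy] -- 2D camera position and velocity.
F = np.eye(4)
F[0, 2] = F[1, 3] = DT                       # constant-velocity transition
B = np.array([[0.5 * DT**2, 0.0],
              [0.0, 0.5 * DT**2],
              [DT, 0.0],
              [0.0, DT]])                    # accelerometer input matrix
Q = np.diag([1e-4, 1e-4, 1e-3, 1e-3])        # process noise

H_POINT = np.hstack([np.eye(2), np.zeros((2, 2))])                  # anchor: observes position
H_FLOW = np.hstack([np.zeros((2, 2)), (F_PX / DEPTH) * np.eye(2)])  # flow: scaled velocity
R_POINT = np.diag([0.02**2, 0.02**2])        # anchor registration noise [m^2]
R_FLOW = np.diag([2.0**2, 2.0**2])           # optical-flow noise [px^2/s^2]


def predict(x, P, accel):
    """Propagate state and covariance with the accelerometer reading."""
    x = F @ x + B @ accel
    P = F @ P @ F.T + Q
    return x, P


def update(x, P, z, H, R):
    """Standard Kalman measurement update, reused for both measurement types."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x, P = np.zeros(4), np.eye(4) * 0.1
    truth = np.zeros(4)
    for k in range(1000):
        accel_true = np.array([0.1 * np.sin(0.01 * k), 0.05])
        truth = F @ truth + B @ accel_true
        x, P = predict(x, P, accel_true + rng.normal(0.0, 0.05, 2))
        if k % 10 == 0:
            if k < 500:   # anchor points visible: absolute position fix
                z = truth[:2] + rng.normal(0.0, 0.02, 2)
                x, P = update(x, P, z, H_POINT, R_POINT)
            else:         # registration gap: optical-flow velocity constraint
                z = (F_PX / DEPTH) * truth[2:] + rng.normal(0.0, 2.0, 2)
                x, P = update(x, P, z, H_FLOW, R_FLOW)
    print("final position error [m]:", np.linalg.norm(x[:2] - truth[:2]))
```

With this structure, position uncertainty still grows during the gap (flow observes velocity, not position), but much more slowly than with inertial prediction alone, which mirrors the effect the paper evaluates in Monte Carlo simulations.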
Pages: 1836-1840
Number of pages: 5
Related papers
50 records in total
  • [21] Visual-Inertial Direct SLAM. Concha, Alejo; Loianno, Giuseppe; Kumar, Vijay; Civera, Javier. 2016 IEEE International Conference on Robotics and Automation (ICRA), 2016: 1331-1338.
  • [22] Robocentric Visual-Inertial Odometry. Huai, Zheng; Huang, Guoquan. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018: 6319-6326.
  • [23] Visual-inertial teach and repeat. Nitsche, Matias; Pessacg, Facundo; Civera, Javier. Robotics and Autonomous Systems, 2020, 131.
  • [24] Visual-Inertial Curve SLAM. Meier, Kevin; Chung, Soon-Jo; Hutchinson, Seth. 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2016), 2016: 1238-1245.
  • [25] A Visual-Inertial Dynamic Object Tracking SLAM Tightly Coupled System. Zhang, Hanxuan; Wang, Dingyi; Huo, Ju. IEEE Sensors Journal, 2023, 23 (17): 19905-19917.
  • [26] The Visual-Inertial Canoe Dataset. Miller, Martin; Chung, Soon-Jo; Hutchinson, Seth. International Journal of Robotics Research, 2018, 37 (01): 13-20.
  • [27] Robocentric visual-inertial odometry. Huai, Zheng; Huang, Guoquan. International Journal of Robotics Research, 2022, 41 (07): 667-689.
  • [28] Using Vanishing Points to Improve Visual-Inertial Odometry. Camposeco, Federico; Pollefeys, Marc. 2015 IEEE International Conference on Robotics and Automation (ICRA), 2015: 5219-5225.
  • [29] Cooperative Visual-Inertial Odometry. Zhu, Pengxiang; Yang, Yulin; Ren, Wei; Huang, Guoquan. 2021 IEEE International Conference on Robotics and Automation (ICRA 2021), 2021: 13135-13141.
  • [30] Visual-inertial SLAM method based on multi-scale optical flow fusion feature point. Wang T.; Liu J.; Wu Z.; Shen Q.; Yao E. Xi Tong Gong Cheng Yu Dian Zi Ji Shu/Systems Engineering and Electronics, 2022, 44 (03): 977-985.