Multispectral Visual-Inertial Navigation Using a Dual-Layer Estimator and Targeted Histogram Equalization

Citations: 0
Authors
Boler, Matthew [1 ]
Martin, Scott [2 ]
Institutions
[1] Auburn Univ, GPS & Vehicle Dynamics Lab (GAVLab), Auburn, AL 36849 USA
[2] Auburn Univ, GPS & Vehicle Dynamics Lab (GAVLab), Auburn, AL 36849 USA
Source
PROCEEDINGS OF THE 34TH INTERNATIONAL TECHNICAL MEETING OF THE SATELLITE DIVISION OF THE INSTITUTE OF NAVIGATION (ION GNSS+ 2021), 2021
Keywords
ROBUST
DOI
10.33012/2021.18082
Chinese Library Classification
TP7 [Remote Sensing Technology]
Subject Classification Codes
081102; 0816; 081602; 083002; 1404
Abstract
The fusion of a camera and an inertial measurement unit (IMU) is a rapidly growing approach to GPS-denied navigation due to its minimal size, weight, and power requirements. To become a standard approach for robust GPS-denied navigation, visual-inertial estimation must perform in all situations, especially those present in challenging environments. Despite the popularity and wide-reaching applicability of the field, most visual-inertial research has focused on a standard platform and problem: a small aerial vehicle equipped with a machine-vision camera and an inexpensive IMU operating in a well-lit indoor environment. As a result, the literature on extending visual-inertial estimation to illumination-challenged environments, such as poorly lit indoor scenarios and nighttime outdoor scenarios, is sparse. These situations present significant challenges to visual navigation systems because their lack of contrast and illumination reduces the ability to extract and match features, thereby reducing the availability of valid visual measurements. To enable operation in such challenging situations, this paper presents a visual-inertial estimator that operates successfully with a visual-spectrum camera in poorly lit environments and with a thermal-infrared camera in illumination-denied environments. This is achieved by modifying the standard visual feature extraction and matching process to improve the robustness of features to contrast and illumination changes. After such features are matched and processed, further robustness and accuracy improvements come from a dual-layer estimator that employs a robust and efficient EKF-based frontend to preprocess and validate incoming measurements before passing them to a batch least-squares optimizing backend. Solutions from the backend are periodically fed back into the frontend to correct accumulated error and maintain accurate real-time state estimates.
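
As a rough illustration of the preprocessing idea described in the abstract, the Python/OpenCV sketch below applies contrast-limited adaptive histogram equalization (CLAHE) before feature detection. This is a minimal sketch under assumptions: the function name, parameter values, and the choice of ORB features are illustrative only, and the paper's "targeted" equalization scheme is not reproduced here.

import cv2


def enhance_and_extract(image_8u, clip_limit=3.0, tile_grid=(8, 8), n_features=500):
    """Equalize local contrast, then detect and describe features.

    image_8u: a single-channel 8-bit frame (a visual-spectrum image, or a
    thermal image already rescaled to 8 bits). All names and defaults here
    are illustrative assumptions, not the paper's implementation.
    """
    # Contrast-limited adaptive histogram equalization spreads intensity
    # values within local tiles, helping detectors find corners in dim or
    # low-contrast regions without globally amplifying noise.
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tile_grid)
    equalized = clahe.apply(image_8u)

    # Any detector/descriptor pair could follow; ORB is used here only
    # because it ships with OpenCV.
    orb = cv2.ORB_create(nfeatures=n_features)
    keypoints, descriptors = orb.detectAndCompute(equalized, None)
    return equalized, keypoints, descriptors


if __name__ == "__main__":
    # "frame.png" is a placeholder path for a grayscale camera frame.
    frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
    if frame is not None:
        _, keypoints, _ = enhance_and_extract(frame)
        print(f"detected {len(keypoints)} features")

In a multispectral setting, the same enhancement step could be applied to thermal imagery once it has been rescaled to 8 bits, so that both camera streams feed a common feature extraction and matching pipeline.
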
Pages: 3149 - 3161
Page count: 13