Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments

Cited: 22
Authors
Lin, Shanggang [1 ,2 ]
Jin, Lianwen [1 ,2 ]
Chen, Ziwei [1 ,2 ]
Affiliations
[1] South China Univ Technol, Sch Elect & Informat Engn, Guangzhou 510640, Peoples R China
[2] South China Univ Technol, Zhuhai Inst Modern Ind Innovat, Zhuhai 519175, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
unmanned aerial vehicle; autonomous landing; low-illumination; marker detection; real-time; UNMANNED AERIAL VEHICLE; TARGET; QUADROTOR; TRACKING; FLIGHT;
DOI
10.3390/s21186226
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Discipline Codes
070302 ; 081704 ;
Abstract
Landing an unmanned aerial vehicle (UAV) autonomously and safely is a challenging task. Although existing approaches have resolved the problem of precise landing by identifying a specific landing marker with the UAV's onboard vision system, the vast majority of these works were conducted in daytime or well-illuminated laboratory environments. In contrast, very few researchers have investigated landing in low-illumination conditions, and those who have rely on active light sources to illuminate the markers. In this paper, a novel vision system is proposed to tackle UAV landing in outdoor extreme low-illumination environments without applying an active light source to the marker. We use a model-based enhancement scheme to improve the quality and brightness of the onboard captured images, then present a hierarchical method consisting of a decision tree with an associated lightweight convolutional neural network (CNN) for coarse-to-fine landing marker localization, where the key information of the marker is extracted and retained for post-processing such as pose estimation and landing control. Extensive evaluations demonstrate the robustness, accuracy, and real-time performance of the proposed vision system. Field experiments across a variety of outdoor nighttime scenarios, with an average luminance of 5 lx at the marker locations, have proven the feasibility and practicability of the system.
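The pipeline described in the abstract (enhance the dark image, then localize the marker coarse-to-fine) can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual method: the gamma curve stands in for the model-based enhancement scheme (whose formulation is not given here), and the bright-blob search stands in for the coarse stage that a real system would follow with the lightweight CNN for fine verification. All function names and parameters are hypothetical.

```python
import numpy as np
from collections import deque

def enhance_low_light(img, gamma=0.4):
    """Brighten a low-illumination grayscale image (uint8, values in [0, 255])
    with a simple gamma curve -- a stand-in for a model-based enhancement."""
    norm = img.astype(np.float32) / 255.0
    return (np.power(norm, gamma) * 255.0).astype(np.uint8)

def coarse_candidates(img, thresh=200, min_area=25):
    """Coarse stage: return bounding boxes (x0, y0, x1, y1) of bright
    4-connected blobs that could contain the landing marker.  Each crop
    would then go to a lightweight CNN for fine localization."""
    mask = img >= thresh
    visited = np.zeros_like(mask, dtype=bool)
    boxes = []
    h, w = mask.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not visited[y, x]:
                # breadth-first search over one bright blob
                queue = deque([(y, x)])
                visited[y, x] = True
                ys, xs = [], []
                while queue:
                    cy, cx = queue.popleft()
                    ys.append(cy)
                    xs.append(cx)
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                if len(ys) >= min_area:  # discard tiny noise blobs
                    boxes.append((min(xs), min(ys), max(xs), max(ys)))
    return boxes

# Usage: a dark frame with one brighter square region (the "marker")
frame = np.full((40, 40), 30, dtype=np.uint8)
frame[10:20, 10:20] = 180
enhanced = enhance_low_light(frame)
candidates = coarse_candidates(enhanced)
```

After enhancement the marker region clears the brightness threshold while the dark background does not, so the coarse stage returns a single candidate box for the fine stage to verify.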
Pages: 25