YoloTag: Vision-based Robust UAV Navigation with Fiducial Markers

Cited by: 0
Authors
Raxit, Sourav [1 ]
Singh, Simant Bahadur [1 ]
Newaz, Abdullah Al Redwan [1 ]
Affiliations
[1] Univ New Orleans, Dept Comp Sci, New Orleans, LA 70148 USA
Keywords
VERSATILE;
DOI
10.1109/RO-MAN60168.2024.10731319
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
By harnessing fiducial markers as visual landmarks in the environment, Unmanned Aerial Vehicles (UAVs) can rapidly build precise maps and navigate spaces safely and efficiently, unlocking their potential for fluent collaboration and coexistence with humans. Existing fiducial marker methods rely on handcrafted feature extraction, which sacrifices accuracy. Deep learning pipelines for marker detection, on the other hand, fail to meet the real-time runtime constraints crucial for navigation applications. In this work, we propose YoloTag, a real-time fiducial marker-based localization system. YoloTag uses a lightweight YOLO v8 object detector to accurately detect fiducial markers in images while meeting the runtime constraints needed for navigation. The detected markers are then used by an efficient perspective-n-point algorithm to estimate UAV states. However, this localization system introduces noise, causing instability in trajectory tracking. To suppress this noise, we design a higher-order Butterworth filter whose parameters are chosen through frequency-domain analysis. We evaluate our algorithm in real-robot experiments in an indoor environment, comparing the trajectory-tracking performance of our method against other approaches under several distance metrics.
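The abstract describes a three-stage pipeline: marker detection, perspective-n-point state estimation, and Butterworth smoothing of the resulting noisy pose stream. The smoothing stage might be sketched as below. This is a hedged illustration using SciPy's standard Butterworth design, not the authors' implementation; the frame rate, cutoff frequency, and filter order are illustrative assumptions (the paper derives its filter through frequency-domain analysis).

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_trajectory(positions, fs=30.0, cutoff=2.0, order=4):
    """Apply a zero-phase higher-order Butterworth low-pass filter
    to each axis of a noisy trajectory estimate.

    positions : (N, 3) array of per-frame position estimates (e.g. from PnP)
    fs        : sampling rate in Hz (assumed camera frame rate)
    cutoff    : cutoff frequency in Hz (illustrative assumption)
    order     : filter order
    """
    # Design the filter with the cutoff normalized to the Nyquist frequency.
    b, a = butter(order, cutoff / (0.5 * fs))
    # filtfilt runs the filter forward and backward for zero phase lag,
    # which avoids delaying the trajectory fed to the controller.
    return filtfilt(b, a, positions, axis=0)

# Example: a slow circular trajectory corrupted by measurement noise.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 30.0)
clean = np.stack([np.sin(0.5 * t), np.cos(0.5 * t),
                  np.full_like(t, 1.5)], axis=1)
noisy = clean + 0.05 * rng.standard_normal(clean.shape)
smoothed = smooth_trajectory(noisy)
```

Because the UAV's motion occupies low frequencies while estimation noise is broadband, a low cutoff preserves the trajectory and attenuates the noise; the smoothed signal ends up closer to the clean one than the raw estimates.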
Pages: 311-316
Page count: 6
Related papers
50 items in total (entries [21]-[30] shown)
  • [21] Robust Vision-based Autonomous Navigation against Environment Changes
    Kim, Jungho
    Bok, Yunsu
    Kweon, In So
    2008 IEEE/RSJ INTERNATIONAL CONFERENCE ON ROBOTS AND INTELLIGENT SYSTEMS, VOLS 1-3, CONFERENCE PROCEEDINGS, 2008, : 696 - 701
  • [22] Robust Extraction of Shady Roads for Vision-based UGV Navigation
    Dong-SI, Tue-Cuong
    Guo, Dong
    Yan, Chye Hwang
    Ong, Sim Heng
    2008 IEEE/RSJ INTERNATIONAL CONFERENCE ON ROBOTS AND INTELLIGENT SYSTEMS, VOLS 1-3, CONFERENCE PROCEEDINGS, 2008, : 3140 - +
  • [23] A Survey on Vision-based Navigation Systems Robust to Illumination Changes
    Jang, Youngseok
    Kim, Changhyeon
    Kim, H. Jin
    2022 INTERNATIONAL CONFERENCE ON ELECTRONICS, INFORMATION, AND COMMUNICATION (ICEIC), 2022,
  • [24] Robust Vision-Based Control of a Rotorcraft UAV for Uncooperative Target Tracking
    Zhang, Shijie
    Zhao, Xiangtian
    Zhou, Botian
    SENSORS, 2020, 20 (12) : 1 - 23
  • [25] Vision-based positioning of Unmanned Surface Vehicles using Fiducial Markers for automatic docking
    Digerud, Lars
    Volden, Oystein
    Christensen, Kim A.
    Kohtala, Sampsa
    Steinert, Martin
    IFAC PAPERSONLINE, 2022, 55 (31): : 78 - 84
  • [26] Vision-Based Aircraft Pose Estimation for UAVs Autonomous Inspection without Fiducial Markers
    Cazzato, Dario
    Olivares-Mendez, Miguel A.
    Sanchez-Lopez, Jose Luis
    Voos, Holger
    45TH ANNUAL CONFERENCE OF THE IEEE INDUSTRIAL ELECTRONICS SOCIETY (IECON 2019), 2019, : 5642 - 5648
  • [27] Efficient vision-based navigation
    Hornung, Armin
    Bennewitz, Maren
    Strasdat, Hauke
    AUTONOMOUS ROBOTS, 2010, 29 (02) : 137 - 149
  • [28] A Robust Crater Matching Algorithm for Autonomous Vision-Based Spacecraft Navigation
    Roberto, Del Prete
    Alfredo, Renga
    2021 IEEE 8TH INTERNATIONAL WORKSHOP ON METROLOGY FOR AEROSPACE (IEEE METROAEROSPACE), 2021, : 322 - 327
  • [29] Robust Vision-Based Autonomous Navigation, Mapping and Landing for MAVs at Night
    Daftry, Shreyansh
    Das, Manash
    Delaune, Jeff
    Sorice, Cristina
    Hewitt, Robert
    Reddy, Shreetej
    Lytle, Daniel
    Gu, Elvin
    Matthies, Larry
    PROCEEDINGS OF THE 2018 INTERNATIONAL SYMPOSIUM ON EXPERIMENTAL ROBOTICS, 2020, 11 : 232 - 242
  • [30] A software platform for vision-based UAV autonomous landing guidance based on markers estimation
    Xu, XiaoBin
    Wang, Zhao
    Deng, YiMin
    Science China Technological Sciences, 2019, 62 : 1825 - 1836