YoloTag: Vision-based Robust UAV Navigation with Fiducial Markers

Citations: 0
Authors
Raxit, Sourav [1 ]
Singh, Simant Bahadur [1 ]
Newaz, Abdullah Al Redwan [1 ]
Affiliations
[1] Univ New Orleans, Dept Comp Sci, New Orleans, LA 70148 USA
Keywords
VERSATILE
DOI
10.1109/RO-MAN60168.2024.10731319
CLC classification (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
By harnessing fiducial markers as visual landmarks in the environment, Unmanned Aerial Vehicles (UAVs) can rapidly build precise maps and navigate spaces safely and efficiently, unlocking their potential for fluent collaboration and coexistence with humans. Existing fiducial marker methods rely on handcrafted feature extraction, which sacrifices accuracy. On the other hand, deep learning pipelines for marker detection fail to meet the real-time runtime constraints crucial for navigation applications. In this work, we propose YoloTag, a real-time fiducial marker-based localization system. YoloTag uses a lightweight YOLO v8 object detector to accurately detect fiducial markers in images while meeting the runtime constraints needed for navigation. The detected markers are then used by an efficient Perspective-n-Point (PnP) algorithm to estimate UAV states. However, this localization system introduces noise, causing instability in trajectory tracking. To suppress this noise, we design a higher-order Butterworth filter derived through frequency-domain analysis. We evaluate our algorithm through real-robot experiments in an indoor environment, comparing the trajectory tracking performance of our method against other approaches in terms of several distance metrics.
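The abstract's pipeline ends by smoothing the PnP-based state estimates with a higher-order Butterworth filter. As a rough illustration only (not the authors' implementation; the filter order, cutoff frequency, and sampling rate below are assumed values), a zero-phase low-pass Butterworth filter over a noisy position track can be sketched with SciPy:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_states(states, order=4, cutoff_hz=2.0, fs_hz=30.0):
    """Zero-phase low-pass Butterworth filtering of a time series of
    UAV state estimates (one column per state dimension, e.g. x, y, z
    positions from PnP). filtfilt runs the filter forward and backward
    so the smoothed track is not delayed relative to the raw one."""
    nyquist = fs_hz / 2.0
    b, a = butter(order, cutoff_hz / nyquist, btype="low")
    return filtfilt(b, a, states, axis=0)

# Synthetic 1-D position track: slow sinusoid plus high-frequency noise,
# standing in for a jittery PnP estimate sampled at 30 Hz.
t = np.linspace(0.0, 4.0, 120)
clean = np.sin(2.0 * np.pi * 0.5 * t)
noisy = clean + 0.2 * np.random.default_rng(0).normal(size=t.size)
smoothed = smooth_states(noisy[:, None]).ravel()
```

Because the slow trajectory sits well below the cutoff while the jitter sits above it, the filtered track stays closer to the true signal than the raw estimates do.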
Pages: 311-316
Page count: 6
Related Papers (50 in total)
  • [41] Airborne Vision-Based Navigation Method for UAV Accuracy Landing Using Infrared Lamps
    Gui, Yang
    Guo, Pengyu
    Zhang, Hongliang
    Lei, Zhihui
    Zhou, Xiang
    Du, Jing
    Yu, Qifeng
    JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, 2013, 72 (02) : 197 - 218
  • [43] Vision-based Navigation of UAV with Continuous Action Space Using Deep Reinforcement Learning
    Zhou, Benchun
    Wang, Weihong
    Liu, Zhenghua
    Wang, Jia
    PROCEEDINGS OF THE 2019 31ST CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2019), 2019, : 5030 - 5035
  • [44] Research on Vision-Based Navigation for Plant Protection UAV under the Near Color Background
    Zhang, Hehu
    Wang, Xiushan
    Chen, Ying
    Jiang, Guoqiang
    Lin, Shifeng
    SYMMETRY-BASEL, 2019, 11 (04):
  • [45] Robust Vision-based Pose Estimation for Relative Navigation of Unmanned Aerial Vehicles
    Park, Jang-Seong
    Lee, Dongjin
    Jeon, Byoungil
    Bang, Hyochoong
    2013 13TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS (ICCAS 2013), 2013, : 386 - 390
  • [46] Landmark selection for vision-based navigation
    Sala, P
    Sim, R
    Shokoufandeh, A
    Dickinson, S
    IEEE TRANSACTIONS ON ROBOTICS, 2006, 22 (02) : 334 - 349
  • [47] Motion and structure for vision-based navigation
    Sagüés, C
    Guerrero, JJ
    ROBOTICA, 1999, 17 : 355 - 364
  • [48] Vision-Based Collaborative Navigation for UAV-UGV-Dismounted Units in GPS Challenged Environments
    Moafipoor, Shahram
    Bock, Lydia
    Fayman, Jeffrey A.
    Conroy, Eoin
    PROCEEDINGS OF THE 33RD INTERNATIONAL TECHNICAL MEETING OF THE SATELLITE DIVISION OF THE INSTITUTE OF NAVIGATION (ION GNSS+ 2020), 2020, : 573 - 584
  • [49] Vision-based UAV Landing on the Moving Vehicle
    Lee, Hanseob
    Jung, Seokwoo
    Shim, David Hyunchul
    2016 INTERNATIONAL CONFERENCE ON UNMANNED AIRCRAFT SYSTEMS (ICUAS), 2016, : 1 - 7
  • [50] Vision-Based Autonomous Landing for the UAV: A Review
    Xin, Long
    Tang, Zimu
    Gai, Weiqi
    Liu, Haobo
    AEROSPACE, 2022, 9 (11)