Vision-based pose estimation of a multi-rotor unmanned aerial vehicle

Cited by: 1

Authors
Gupta, Kashish [1 ]
Emran, Bara Jamal [1 ]
Najjaran, Homayoun [1 ]
Affiliations
[1] Univ British Columbia, Sch Engn, Kelowna, BC, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Unmanned aerial vehicles; Computer vision; 3-D pose estimation; Autonomous landing; SYSTEM;
DOI
10.1108/IJIUS-10-2018-0030
Chinese Library Classification
TP24 [Robotics];
Subject classification codes
080202 ; 1405 ;
Abstract
Purpose - The purpose of this paper is to facilitate autonomous landing of a multi-rotor unmanned aerial vehicle (UAV) on a moving/tilting platform using a robust vision-based approach.
Design/methodology/approach - Autonomous landing of a multi-rotor UAV on a moving or tilting platform of unknown orientation in a GPS-denied and vision-compromised environment presents a challenge to common autopilot systems. The paper proposes a robust visual data processing system based on the target's Oriented FAST and Rotated BRIEF (ORB) features to estimate the UAV's three-dimensional pose in real time.
Findings - The system is able to visually locate and identify the unique landing platform based on a cooperative marker, with an error of 1 degree or less in each of the roll, pitch and yaw angles.
Originality/value - The simplicity of the training procedure gives the process the flexibility needed to use a marker of any unknown/irregular shape or dimension. The process can be easily tweaked to respond to different cooperative markers, and the computationally inexpensive on-board process can be added to off-the-shelf autopilots.
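The abstract reports orientation accuracy in roll, pitch and yaw, which implies converting an estimated rotation (e.g. the matrix recovered from feature-based pose estimation) into Euler angles for comparison against ground truth. The paper gives no implementation details; the sketch below is purely illustrative, assuming a ZYX (yaw-pitch-roll) Euler convention and a rotation matrix such as one obtained from OpenCV's `cv2.Rodrigues` applied to a `solvePnP` rotation vector. The function name is hypothetical.

```python
import math
import numpy as np

def rotation_matrix_to_rpy(R):
    """Illustrative sketch (not the paper's code): convert a 3x3
    rotation matrix to (roll, pitch, yaw) in degrees, assuming the
    ZYX Euler convention R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    pitch = math.asin(max(-1.0, min(1.0, -R[2, 0])))  # rotation about y
    if abs(R[2, 0]) < 0.9999:                # away from gimbal lock
        roll = math.atan2(R[2, 1], R[2, 2])  # rotation about x
        yaw = math.atan2(R[1, 0], R[0, 0])   # rotation about z
    else:                                    # pitch near +-90 deg:
        roll = 0.0                           # roll/yaw are coupled,
        yaw = math.atan2(-R[0, 1], R[1, 1])  # fix roll to zero
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))

# Example: a pure 30-degree yaw rotation about the z axis
c, s = math.cos(math.radians(30)), math.sin(math.radians(30))
R = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])
roll, pitch, yaw = rotation_matrix_to_rpy(R)
```

An error metric like the one in the Findings section could then be the absolute difference between such angles and the platform's true orientation.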
Pages: 120-132
Number of pages: 13