Relative target estimation using a cascade of extended Kalman filters

Cited: 0
Authors
Nielsen, Jerel [1 ]
Beard, Randal W. [2 ]
Affiliations
[1] Brigham Young Univ, Multiple Agent Intelligent Coordination and Control (MAGICC) Lab, Provo, UT 84602 USA
[2] Brigham Young Univ, Elect Engn Dept, Provo, UT 84602 USA
Source
PROCEEDINGS OF THE 30TH INTERNATIONAL TECHNICAL MEETING OF THE SATELLITE DIVISION OF THE INSTITUTE OF NAVIGATION (ION GNSS+ 2017) | 2017
Funding
U.S. National Science Foundation;
Keywords
RECURSIVE-RANSAC; TRACKING;
DOI
Not available
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic and Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
This paper presents a method of tracking multiple ground targets from an unmanned aerial vehicle (UAV) in a 3D reference frame. The tracking method uses a monocular camera and makes no assumptions about the shape of the terrain or the target motion. The UAV runs two cascaded estimators. The first is an Extended Kalman Filter (EKF) responsible for tracking the UAV's state, such as position and velocity relative to a fixed frame. The second estimator is an EKF responsible for estimating a fixed number of landmarks within the camera's field of view. Landmarks are parameterized by a quaternion associated with the bearing from the camera's optical axis and an inverse-distance parameter. The bearing quaternion allows for a minimal representation of each landmark's direction and distance, a filter with no singularities, and a fast update rate due to the small number of trigonometric function evaluations. Three methods for estimating the ground target positions are demonstrated: the first applies the landmark estimator directly to the targets, the second computes the target depth as a weighted average of converged landmark depths, and the third extends the target's measured bearing vector to intersect a ground plane approximated from the landmark estimates. Simulation results show that the third target estimation method yields the most accurate results.
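The third target-localization method described in the abstract (extending the target's measured bearing ray until it meets a ground plane approximated from the landmark estimates) can be illustrated with a short sketch. This is a minimal, hypothetical illustration under stated assumptions, not the authors' implementation: the function names, the SVD-based least-squares plane fit, and the example numbers are assumptions; in the paper the converged landmark states and the UAV state estimate would supply the inputs.

```python
import numpy as np

def fit_ground_plane(landmark_positions):
    """Least-squares fit of a plane n . p = d to estimated landmark positions.

    landmark_positions: (N, 3) array of landmark estimates in the fixed frame.
    Returns a unit normal n and scalar offset d. (Assumed fitting scheme.)
    """
    centroid = landmark_positions.mean(axis=0)
    # The right singular vector with the smallest singular value spans the
    # direction of least spread, i.e. the plane normal.
    _, _, vt = np.linalg.svd(landmark_positions - centroid)
    n = vt[-1]
    d = n @ centroid
    return n, d

def intersect_bearing_with_plane(cam_pos, bearing, n, d):
    """Extend a unit bearing vector from the camera position to the plane n . p = d.

    Returns the 3D intersection point, or None if the ray is (nearly) parallel
    to the plane or the intersection lies behind the camera.
    """
    denom = n @ bearing
    if abs(denom) < 1e-9:
        return None
    t = (d - n @ cam_pos) / denom
    if t < 0:
        return None
    return cam_pos + t * bearing

# Example (hypothetical numbers): camera 50 m above roughly flat terrain,
# target bearing 30 degrees off nadir in the camera/fixed frame.
landmarks = np.array([[10.0, 5.0, 0.2],
                      [-8.0, 12.0, -0.1],
                      [3.0, -7.0, 0.3],
                      [15.0, 9.0, 0.0]])
n, d = fit_ground_plane(landmarks)
cam_pos = np.array([0.0, 0.0, 50.0])
bearing = np.array([np.sin(np.radians(30.0)), 0.0, -np.cos(np.radians(30.0))])
print(intersect_bearing_with_plane(cam_pos, bearing, n, d))
```

The abstract reports that this ray/ground-plane construction outperforms running the landmark estimator directly on the targets or averaging converged landmark depths, since it uses the full set of converged landmarks to constrain the otherwise unobservable depth of a single bearing measurement.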
Citation
Pages: 2273-2289
Number of pages: 17
Related papers
18 in total
[1] [Anonymous], 2014, THESIS B YOUNG U PRO
[2] Beard R. W., 2012, SMALL UNMANNED AIRCR
[3] Bloesch M, 2015, IEEE INT C INT ROBOT, P298, DOI 10.1109/IROS.2015.7353389
[4] Bloesch Michael Andre, 2017, INERTIAL SENSING COM
[5] Escamilla-Ambrosio PJ, 2004, 2004 IEEE INTELLIGENT VEHICLES SYMPOSIUM, P601
[6] Fortmann T. E., 1980, Proceedings of the 19th IEEE Conference on Decision & Control Including the Symposium on Adaptive Processes, P807
[7] Hertzberg C., Wagner R., Frese U., Schroeder L., Integrating generic sensor fusion algorithms with sound state representations through encapsulation of manifolds, INFORMATION FUSION, 2013, 14(1): 57-77
[8] Hwang I, 2004, P AMER CONTR CONF, P3422
[9] Ingersoll K, 2015, INT CONF UNMAN AIRCR, P1320, DOI 10.1109/ICUAS.2015.7152426
[10] JONES BA, 2015, INFORMATION FUSION F, P1278