MODEEP: a motion-based object detection and pose estimation method for airborne FLIR sequences

Cited by: 0
Authors
Alexander Strehl
J.K. Aggarwal
Affiliations
Computer and Vision Research Center, Department of Electrical and Computer Engineering, The University of Texas at Austin, Austin, TX 78712-1084, USA; e-mail: {strehl, aggarwaljk}@mail.utexas.edu
Source
Machine Vision and Applications | 2000 / Volume 11
Keywords
Motion detection – Object segmentation – Pose estimation – Moving camera – Affine image registration – Infrared – Bayes
DOI
Not available
Abstract
In this paper, we present a method called MODEEP (Motion-based Object DEtection and Estimation of Pose) to detect independently moving objects (IMOs) in forward-looking infrared (FLIR) image sequences taken from an airborne, moving platform. Ego-motion effects are removed through a robust multi-scale affine image registration process. Thereafter, areas with residual motion indicate potential object activity. These areas are detected, refined and selected using a Bayesian classifier. The resulting regions are clustered into pairs such that each pair represents one object's front and rear end. Using motion and scene knowledge, we estimate object pose and establish a region of interest (ROI) for each pair. Edge elements within each ROI are used to segment the convex cover containing the IMO. We show detailed results on real, complex, cluttered and noisy sequences. Moreover, we outline the integration of our fast and robust system into a comprehensive automatic target recognition (ATR) and action classification system.
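The first stage outlined in the abstract (affine ego-motion compensation followed by residual-motion detection) can be illustrated with a short sketch. The snippet below is a simplified stand-in, not the authors' method: it uses OpenCV's ECC affine estimator in place of the paper's robust multi-scale registration, and the function name residual_motion_mask, the threshold, and the morphology settings are illustrative assumptions.

# Simplified sketch of ego-motion removal and residual-motion detection.
# Assumes two consecutive grayscale (uint8) FLIR frames of equal size.
import cv2
import numpy as np

def residual_motion_mask(prev_gray, curr_gray, diff_thresh=25):
    """Affinely register prev_gray onto curr_gray, then flag pixels whose
    intensity difference survives the warp as candidate object motion."""
    warp = np.eye(2, 3, dtype=np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 100, 1e-5)
    # Estimate the global (camera-induced) affine motion (OpenCV >= 4.1 signature).
    _, warp = cv2.findTransformECC(curr_gray, prev_gray, warp,
                                   cv2.MOTION_AFFINE, criteria, None, 5)
    h, w = curr_gray.shape
    # Warp the previous frame into the current frame's coordinates.
    stabilized = cv2.warpAffine(prev_gray, warp, (w, h),
                                flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
    # Residual differences indicate motion not explained by the camera.
    residual = cv2.absdiff(stabilized, curr_gray)
    _, mask = cv2.threshold(residual, diff_thresh, 255, cv2.THRESH_BINARY)
    # Morphological cleanup before any further classification or clustering.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

In the full system described in the paper, the surviving residual regions would then be refined and selected by the Bayesian classifier, clustered into front/rear pairs, and used to establish the ROI for pose estimation and convex-cover segmentation.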
Pages: 267-276
Number of pages: 9