MLP-Based Efficient Stitching Method for UAV Images

Cited by: 7
Authors
Ren, Moxuan [1]
Li, Jianan [1]
Song, Liqiang [2]
Li, Hui [2]
Xu, Tingfa [1,3]
Affiliations
[1] Beijing Institute of Technology, Beijing 100081, China
[2] National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100101, China
[3] Beijing Institute of Technology, Chongqing Innovation Center, Chongqing 401147, China
Keywords
Autonomous aerial vehicles; Image stitching; Training; Real-time systems; Image registration; Cameras; Annotations; Aerial image; multi-layer perceptron (MLP); position and attitude; mosaicking
DOI
10.1109/LGRS.2022.3141890
Chinese Library Classification (CLC)
P3 [Geophysics]; P59 [Geochemistry]
Discipline classification code
0708; 070902
Abstract
Unmanned aerial vehicle (UAV) image stitching techniques based on position and attitude information have shown a clear speed advantage over feature-based counterparts. However, improving stitching accuracy and robustness remains a great challenge, since position and attitude parameters are sensitive to noise introduced by the sensors and the external environment. To mitigate this issue, this work presents a simple yet effective stitching algorithm for UAV images based on a coarse-to-fine strategy. Specifically, we first conduct coarse registration using the position and attitude information obtained from the GPS, IMU, and altimeter. Then, we introduce a novel offline calibration phase that uses multi-layer perceptron (MLP) neural networks to quickly regress the resulting global transformation matrix toward the optimal one computed by feature-based algorithms. Consequently, the proposed method integrates the complementary strengths of parameter-based and feature-based methods, achieving an ideal speed-accuracy tradeoff. Moreover, to facilitate research on this topic, we release to the community a new dataset, named UAV-AIRPAI, comprising over 100 UAV image pairs with position and attitude annotations, opening up a promising direction for UAV image stitching. Extensive experiments on the UAV-AIRPAI dataset show that our method achieves superior accuracy compared with prior methods while running in real time at 0.0124 s per image pair. Code and data will be available at https://github.com/dededust/UAV-AIRPAI.
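The sketch below illustrates the coarse-to-fine idea described in the abstract: an MLP is calibrated offline to map a coarse, pose-derived global homography toward the refined homography a feature-based pipeline would produce, so that only one cheap forward pass is needed at stitching time. This is not the authors' released code; the 8-parameter homography encoding, network sizes, and training settings are illustrative assumptions.

```python
# Minimal sketch (assumed details, not the paper's implementation) of an MLP that
# corrects a coarse global homography estimated from GPS/IMU/altimeter pose.
import torch
import torch.nn as nn

class HomographyRefinerMLP(nn.Module):
    """Maps a coarse 3x3 homography, flattened with the last entry fixed to 1
    (8 free parameters), to a corrected 8-parameter homography."""
    def __init__(self, hidden_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(8, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 8),
        )

    def forward(self, coarse_h8: torch.Tensor) -> torch.Tensor:
        # Predict a residual so the identity mapping is trivially representable.
        return coarse_h8 + self.net(coarse_h8)

def calibrate(model, coarse_batch, feature_batch, epochs: int = 200, lr: float = 1e-3):
    """Offline calibration: regress coarse (pose-derived) homographies toward the
    'optimal' homographies computed by a feature-based method (e.g., keypoint
    matching + RANSAC) on a calibration set."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(coarse_batch), feature_batch)
        loss.backward()
        opt.step()
    return model

# At stitching time, only the coarse homography and a single MLP forward pass are
# needed, which is what makes a real-time speed-accuracy tradeoff plausible.
```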
Pages: 5