A pose estimation method of space non-cooperative target based on ORBFPFH SLAM

Cited by: 4
Authors
Yan, Zhiqiang [1,2]
Wang, Hongyuan [2]
Ze, Liuchuanjiang [2]
Ning, Qianhao [2]
Lu, Yinxi [2]
Affiliations
[1] Harbin Inst Technol, Elect Sci & Technol Postdoctoral Res Stn, Harbin 150001, Peoples R China
[2] Harbin Inst Technol, Space Opt Engn Res Ctr, Harbin 150001, Peoples R China
Source
OPTIK | 2023, Vol. 286
Funding
National Natural Science Foundation of China
Keywords
Space non-cooperative target; ORBFPFH; SLAM; Pose estimation; Pose graph optimization; BoW; OPTIMIZATION; SYSTEM
DOI
10.1016/j.ijleo.2023.171025
Chinese Library Classification (CLC)
O43 [Optics]
Discipline classification codes
070207; 0803
Abstract
In this paper, to improve the pose measurement accuracy of a Time-of-Flight (ToF) camera for space non-cooperative targets, a pose estimation method based on ORBFPFH Simultaneous Localization and Mapping (SLAM) is proposed, which effectively integrates the intensity and depth measurement information of the ToF camera. The method proceeds as follows. First, a dedicated ORBFPFH Bag of Words (BoW) model for space targets is trained on a ToF image dataset of spacecraft. Second, the pose of the non-cooperative space target is tracked using the ORBFPFH feature and refined by pose graph optimization. Third, loop closures are detected with the ORBFPFH BoW model to reduce the accumulated error. Finally, the proposed method is evaluated on the ToF image dataset: compared with the state-of-the-art ORB-SLAM2 algorithm, it yields smaller translation and rotation errors on the test data, with a mean translation error of less than 0.144 m and a mean rotation error of less than 0.642 degrees. The test results show that the proposed method improves the pose estimation accuracy of space non-cooperative targets, achieves better 3D point cloud reconstruction, and provides technical support for space applications such as rendezvous and docking with non-cooperative spacecraft.
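The abstract describes the ORBFPFH feature only at a high level. As a rough, hedged illustration of the idea (not the authors' implementation), the Python sketch below extracts ORB features from the ToF intensity image with OpenCV and FPFH descriptors from the ToF point cloud with Open3D; the function name `extract_orbfpfh`, the parameter values, and the choice of libraries are assumptions made for this sketch.

```python
import cv2
import numpy as np
import open3d as o3d

def extract_orbfpfh(intensity_img, points_xyz, voxel_size=0.05):
    """Illustrative sketch: 2D ORB + 3D FPFH features from one ToF frame.

    intensity_img: uint8 grayscale ToF intensity/amplitude image.
    points_xyz:    (N, 3) float array of ToF points in camera coordinates.
    Names and parameters are assumptions, not the paper's values.
    """
    # 2D branch: ORB keypoints and binary descriptors on the intensity image.
    orb = cv2.ORB_create(nfeatures=1000)
    keypoints, orb_desc = orb.detectAndCompute(intensity_img, None)

    # 3D branch: FPFH descriptors on the downsampled depth point cloud.
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points_xyz)
    pcd = pcd.voxel_down_sample(voxel_size)
    pcd.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel_size, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        pcd,
        o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel_size, max_nn=100))

    # fpfh.data has shape (33, M); transpose to one 33-D descriptor per point.
    return keypoints, orb_desc, np.asarray(fpfh.data).T
```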
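The pose graph optimization step could likewise be sketched with Open3D's built-in pose graph backend; the paper does not specify its optimization library, and the inputs (`relative_transforms`, the loop-closure indices, and `T_loop`) are placeholders for this illustration.

```python
import numpy as np
import open3d as o3d

reg = o3d.pipelines.registration

def optimize_pose_graph(relative_transforms, loop_src, loop_dst, T_loop):
    """Build and optimize a pose graph from frame-to-frame estimates.

    relative_transforms: list of 4x4 transforms between consecutive frames.
    loop_src, loop_dst:  indices of a detected loop-closure frame pair.
    T_loop:              4x4 relative transform measured at the loop closure.
    """
    pose_graph = reg.PoseGraph()
    pose_graph.nodes.append(reg.PoseGraphNode(np.eye(4)))  # frame 0 at origin

    odometry = np.eye(4)
    for i, T_rel in enumerate(relative_transforms):
        odometry = odometry @ T_rel
        # Nodes store the inverse accumulated odometry (Open3D convention);
        # consecutive-frame edges are treated as certain.
        pose_graph.nodes.append(reg.PoseGraphNode(np.linalg.inv(odometry)))
        pose_graph.edges.append(
            reg.PoseGraphEdge(i, i + 1, T_rel, np.eye(6), uncertain=False))

    # Loop-closure edges are marked uncertain so inconsistent ones get pruned.
    pose_graph.edges.append(
        reg.PoseGraphEdge(loop_src, loop_dst, T_loop, np.eye(6), uncertain=True))

    reg.global_optimization(
        pose_graph,
        reg.GlobalOptimizationLevenbergMarquardt(),
        reg.GlobalOptimizationConvergenceCriteria(),
        reg.GlobalOptimizationOption(max_correspondence_distance=0.05,
                                     edge_prune_threshold=0.25,
                                     reference_node=0))
    return pose_graph  # optimized in place by global_optimization
```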
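Finally, the BoW loop-closure step builds on DBoW2-style vocabularies (Galvez-Lopez and Tardos); the sketch below substitutes a hypothetical k-means vocabulary over ORB descriptors purely to illustrate the histogram-matching idea, with the threshold value chosen arbitrarily.

```python
import numpy as np
from sklearn.cluster import MiniBatchKMeans

def train_vocabulary(stacked_descriptors, num_words=500):
    """Cluster ORB descriptors from a training dataset into visual words."""
    kmeans = MiniBatchKMeans(n_clusters=num_words, random_state=0)
    kmeans.fit(stacked_descriptors.astype(np.float32))
    return kmeans

def bow_vector(kmeans, frame_descriptors):
    """Turn one frame's descriptors into an L2-normalized word histogram."""
    words = kmeans.predict(frame_descriptors.astype(np.float32))
    hist = np.bincount(words, minlength=kmeans.n_clusters).astype(np.float32)
    return hist / (np.linalg.norm(hist) + 1e-12)

def is_loop_candidate(vec_a, vec_b, threshold=0.8):
    """Flag a loop closure when two frames' BoW vectors are similar enough."""
    return float(np.dot(vec_a, vec_b)) >= threshold
```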
Pages: 16