Vision-Based Spacecraft Pose Estimation via a Deep Convolutional Neural Network for Noncooperative Docking Operations

Cited by: 45
Authors
Phisannupawong, Thaweerath [1,2]
Kamsing, Patcharin [1]
Torteeka, Peerapong [3]
Channumsin, Sittiporn [4]
Sawangwit, Utane [3]
Hematulin, Warunyu [1]
Jarawan, Tanatthep [1]
Somjit, Thanaporn [1]
Yooyen, Soemsak [1]
Delahaye, Daniel [5]
Boonsrimuang, Pisit [6]
Affiliations
[1] King Mongkuts Inst Technol Ladkrabang, Int Acad Aviat Ind, Dept Aeronaut Engn, Air Space Control Optimizat & Management Lab, Bangkok 10520, Thailand
[2] Natl Astron Res Inst Thailand, Internship Program, Chiang Mai 50180, Thailand
[3] Natl Astron Res Inst Thailand, Res Grp, Chiang Mai 50180, Thailand
[4] Geoinformat & Space Technol Dev Agcy GISTDA, Astrodynam Res Lab, Chon Buri 20230, Thailand
[5] Ecole Natl Aviat Civile, F-31400 Toulouse, France
[6] King Mongkuts Inst Technol Ladkrabang, Fac Engn, Bangkok 10520, Thailand
Keywords
spacecraft docking operation; on-orbit services; pose estimation; deep convolutional neural network;
DOI
10.3390/aerospace7090126
Chinese Library Classification (CLC)
V [Aviation, Aerospace];
Discipline Classification Codes
08; 0825;
Abstract
The capture of a target spacecraft by a chaser is an on-orbit docking operation that requires an accurate, reliable, and robust object recognition algorithm. Vision-based guidance of spacecraft relative motion during close-proximity maneuvers has previously been applied using dynamic modeling in spacecraft on-orbit service systems. This research constructs a vision-based pose estimation model that performs image processing via a deep convolutional neural network. The pose estimation model was constructed by repurposing a modified pretrained GoogLeNet model with an available Unreal Engine 4-rendered dataset of the Soyuz spacecraft. In the implementation, the convolutional neural network learns from the data samples to create correlations between the images and the spacecraft's six degrees-of-freedom parameters. The experiments compared an exponential-based loss function and a weighted Euclidean-based loss function. Using the weighted Euclidean-based loss function, the implemented pose estimation model achieved moderately high performance, with a position accuracy of 92.53 percent and a position error of 1.2 m. The attitude prediction accuracy reaches 87.93 percent, and the errors in the three Euler angles do not exceed 7.6 degrees. This research can contribute to spacecraft detection and tracking problems. Although the finished vision-based model is specific to the environment of the synthetic dataset, it could be trained further to address actual docking operations in the future.
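The approach described in the abstract, a pretrained GoogLeNet backbone repurposed to regress a six degrees-of-freedom pose and trained with a weighted Euclidean-based loss, can be sketched as follows. This is a minimal PyTorch illustration rather than the authors' code: the 7-parameter output layout (3D position plus a unit quaternion), the weighting factor `beta`, and the names `build_pose_net` and `weighted_euclidean_loss` are assumptions for illustration, and the loss follows the common PoseNet-style convention rather than the paper's exact formulation.

```python
# Hedged sketch of a GoogLeNet-based spacecraft pose regressor (not the authors' code).
import torch
import torch.nn as nn
import torchvision


def build_pose_net() -> nn.Module:
    # Load an ImageNet-pretrained GoogLeNet; the auxiliary classifiers are
    # dropped by default when pretrained weights are loaded.
    backbone = torchvision.models.googlenet(
        weights=torchvision.models.GoogLeNet_Weights.DEFAULT
    )
    # Replace the 1000-class classifier with a 7-dimensional pose head:
    # [x, y, z, qw, qx, qy, qz] (assumed output layout).
    backbone.fc = nn.Linear(backbone.fc.in_features, 7)
    return backbone


def weighted_euclidean_loss(pred: torch.Tensor,
                            target: torch.Tensor,
                            beta: float = 100.0) -> torch.Tensor:
    # Position term plus a beta-weighted orientation term; the predicted
    # quaternion is normalized before comparison. `beta` is a hypothetical value.
    pos_pred, quat_pred = pred[:, :3], pred[:, 3:]
    pos_true, quat_true = target[:, :3], target[:, 3:]
    quat_pred = quat_pred / quat_pred.norm(dim=1, keepdim=True)
    pos_err = torch.norm(pos_pred - pos_true, dim=1)
    quat_err = torch.norm(quat_pred - quat_true, dim=1)
    return (pos_err + beta * quat_err).mean()


if __name__ == "__main__":
    model = build_pose_net()
    images = torch.randn(4, 3, 224, 224)   # dummy batch standing in for rendered Soyuz frames
    poses = torch.randn(4, 7)               # dummy ground-truth poses
    poses[:, 3:] /= poses[:, 3:].norm(dim=1, keepdim=True)  # unit quaternions
    loss = weighted_euclidean_loss(model(images), poses)
    loss.backward()
    print(float(loss))
```

The single weighting factor trades off translational against rotational error; in practice it would be tuned (or learned) for the working distance of the docking scenario.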
Pages: 1-22
Number of pages: 22