Automatic attitude measurement of laser tracker based on deep learning and PnP model

Cited by: 0
Authors
Zhou D. [1, 2]
Gao D. [1]
Dong D. [1, 2]
Zhou W. [1, 2]
Cui C. [1]
Affiliations
[1] Institute of Microelectronics of the Chinese Academy of Sciences, Beijing
[2] University of Chinese Academy of Sciences, Beijing
Source
Guangxue Jingmi Gongcheng/Optics and Precision Engineering | 2022, Vol. 30, No. 9
Keywords
Attitude measurement; Deep learning; Laser tracker; Monocular vision;
DOI
10.37188/OPE.20223009.1047
Abstract
In view of the urgent demand for attitude measurement in high-end manufacturing applications, such as aerospace and automobile assembly, a fast and high-precision attitude measurement method for a laser tracker was proposed. The method employed deep learning in conjunction with the visual PnP model to realize automatic attitude measurement of the laser tracker. The correspondence between 3D and 2D feature points required by the traditional PnP model was determined directly through a feature extraction network designed to extract high-dimensional features. The joint probability distribution between feature vectors was determined using optimal transport theory to complete the matching of 3D-2D feature points. Subsequently, RANSAC-P3P combined with the EPnP algorithm was used to obtain high-precision attitude information. On this basis, the Jacobian matrix of the PnP solution process was calculated using implicit differentiation, and the PnP attitude solution model was integrated into the network to guide its training. The complementary advantages of the deep network's strong matching ability and the PnP model's high attitude solution accuracy improved the solution accuracy of the network. In addition, a dataset with rich annotation information was used to train the attitude measurement network for the laser tracker. Finally, an attitude measurement test was conducted using a high-precision two-axis turntable. The experimental results show that the pitch angle error is less than 0.31°, the roll angle error is less than 0.03°, and a single measurement takes approximately 40 ms. The proposed method can potentially be applied to attitude measurement scenarios involving laser trackers. © 2022, Science Press. All rights reserved.
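A minimal sketch (not the authors' released code) of the matching-plus-PnP stage described in the abstract: descriptors assumed to come from the feature extraction network are matched with a Sinkhorn-style optimal-transport step, and the pose is then recovered with RANSAC-P3P hypotheses followed by an EPnP solve on the inliers using OpenCV. Function names, tolerances, and the Euler-angle convention are illustrative assumptions.

```python
# Sketch of 3D-2D matching via optimal transport and pose recovery via
# RANSAC-P3P + EPnP (OpenCV). Illustrative only; not the paper's implementation.
import numpy as np
import cv2


def sinkhorn_match(desc_3d, desc_2d, n_iters=50, eps=0.1):
    """Soft 3D-2D assignment via entropic optimal transport (Sinkhorn iterations).

    desc_3d: (N, D) descriptors of the known 3D target points
    desc_2d: (M, D) descriptors of the detected 2D keypoints
    Returns an (N, M) joint-probability matrix; argmax per row gives matches.
    """
    sim = desc_3d @ desc_2d.T                       # similarity scores
    K = np.exp(sim / eps)                           # Gibbs kernel
    a = np.full(len(desc_3d), 1.0 / len(desc_3d))   # uniform marginals
    b = np.full(len(desc_2d), 1.0 / len(desc_2d))
    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(n_iters):                        # alternating marginal scaling
        u = a / (K @ v)
        v = b / (K.T @ u)
    return (u[:, None] * K) * v[None, :]            # diag(u) K diag(v)


def solve_pose(pts_3d, pts_2d, camera_matrix, dist_coeffs=None):
    """RANSAC with P3P minimal samples, then an EPnP solve on the inliers."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        pts_3d, pts_2d, camera_matrix, dist_coeffs,
        iterationsCount=200, reprojectionError=2.0, flags=cv2.SOLVEPNP_P3P)
    if not ok:
        raise RuntimeError("RANSAC-P3P found no consistent pose")
    idx = inliers.ravel()
    ok, rvec, tvec = cv2.solvePnP(
        pts_3d[idx], pts_2d[idx], camera_matrix, dist_coeffs,
        flags=cv2.SOLVEPNP_EPNP)
    R, _ = cv2.Rodrigues(rvec)                      # rotation vector -> matrix
    # Pitch/roll under a ZYX (yaw-pitch-roll) convention; adapt to the rig's axes.
    pitch = np.degrees(np.arctan2(-R[2, 0], np.hypot(R[2, 1], R[2, 2])))
    roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    return R, tvec, pitch, roll
```

The differentiable training path, in which implicit differentiation of the PnP solution supplies gradients back to the feature network, is not shown in this sketch.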
Pages: 1047-1057
Number of pages: 10