Extrinsic Calibration between Camera and LiDAR Sensors by Matching Multiple 3D Planes

Cited by: 58
Authors
Kim, Eung-su [1 ]
Park, Soon-Yong [2 ]
Affiliations
[1] Kyungpook Natl Univ, Sch Comp Sci & Engn, Daegu 41566, South Korea
[2] Kyungpook Natl Univ, Sch Elect Engn, Daegu 41566, South Korea
Funding
National Research Foundation, Singapore;
Keywords
camera; LiDAR; calibration; plane matching; ICP; projection;
DOI
10.3390/s20010052
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Subject Classification Codes
070302 ; 081704 ;
Abstract
This paper proposes a simple extrinsic calibration method, based on a planar chessboard, for a multi-sensor system consisting of six image cameras and a 16-channel 3D LiDAR sensor. The six cameras are mounted on a specially designed hexagonal plate to capture omnidirectional images, and the LiDAR sensor is mounted on top of the plate to capture 3D points over 360 degrees. Treating each camera-LiDAR pair as an independent multi-sensor unit, the rotation and translation between the two sensor coordinate systems are calibrated. The 2D chessboard corners in the camera image are reprojected into 3D space to fit a 3D plane with respect to the camera coordinate system, and the corresponding 3D points that scan the chessboard are used to fit another 3D plane with respect to the LiDAR coordinate system. The rotation matrix is calculated by aligning the normal vectors of the corresponding planes. In addition, an arbitrary point on the 3D camera plane is projected onto the LiDAR plane, and the distance between the two points is iteratively minimized to estimate the translation vector. At least three planes are used to find accurate extrinsic parameters between the coordinate systems. Finally, the estimated transformation is refined using the distances between all chessboard 3D points and the LiDAR plane. In the experiments, quantitative error analysis is performed with a simulation tool, and real test sequences are used to analyze calibration consistency.
Pages: 17
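The abstract outlines a plane-correspondence calibration: the plane normals constrain the rotation, and the plane offsets constrain the translation. Below is a minimal sketch of that idea, assuming each chessboard pose yields plane parameters (n, d) with n·x + d = 0 fitted in both the camera and the LiDAR frames; it uses a closed-form SVD normal alignment and a linear least-squares solve rather than the paper's iterative point-projection and refinement procedure, and the function and variable names are illustrative only.

```python
import numpy as np

def calibrate_from_planes(n_cam, d_cam, n_lidar, d_lidar):
    """Estimate (R, t) mapping camera coordinates to LiDAR coordinates
    from N >= 3 corresponding planes, each given as n . x + d = 0.

    n_cam, n_lidar : (N, 3) unit plane normals in each frame.
    d_cam, d_lidar : (N,)   plane offsets in each frame.
    """
    n_cam = np.asarray(n_cam, dtype=float)
    n_lidar = np.asarray(n_lidar, dtype=float)

    # Rotation: align camera normals with LiDAR normals (Kabsch/SVD),
    # i.e. find R minimizing sum_i || n_lidar_i - R n_cam_i ||^2.
    H = n_cam.T @ n_lidar
    U, _, Vt = np.linalg.svd(H)
    s = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, s]) @ U.T

    # Translation: substituting X_lidar = R X_cam + t into the LiDAR plane
    # equation gives n_lidar_i . t = d_cam_i - d_lidar_i, a linear system
    # that becomes well-posed once three non-parallel planes are observed.
    b = np.asarray(d_cam, dtype=float) - np.asarray(d_lidar, dtype=float)
    t, *_ = np.linalg.lstsq(n_lidar, b, rcond=None)
    return R, t
```

In the paper, an initial estimate of this kind is further refined by minimizing the distances between all chessboard 3D points and the LiDAR plane, as described in the abstract.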