Unified Calibration for Multi-camera Multi-LiDAR Systems using a Single Checkerboard

Cited by: 8
Authors
Lee, Wonmyung [1 ]
Won, Changhee [1 ,2 ]
Lim, Jongwoo [1 ,2 ]
Affiliations
[1] Hanyang Univ, Dept Comp Science, Seoul, South Korea
[2] MultiplEYE Co Ltd, Seoul, South Korea
Source
2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS) | 2020
Funding
National Research Foundation, Singapore;
Keywords
COLOR CAMERA;
DOI
10.1109/IROS45743.2020.9340946
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
In this paper, we propose a unified calibration method for multi-camera multi-LiDAR systems. Using only a single planar checkerboard, the checkerboard frames captured by each sensor are classified as global frames if the board is observed by at least two sensors, or as local frames if it is observed by a single camera. Both the global and local frames of each camera are used to estimate its intrinsic parameters, whereas the global frames shared between sensors are used to compute their relative poses. In contrast to previous methods that simply compose separately estimated pairwise poses (e.g., camera-to-camera or camera-to-LiDAR), we further optimize the sensor poses of the entire system globally, using all observations as constraints in the optimization problem. We find that point-to-plane distances are effective camera-to-LiDAR constraints, where the points are the 3D positions of the checkerboard corners and the planes are estimated from the LiDAR point cloud. Moreover, the abundant corner observations in the local frames enable the joint optimization of intrinsic and extrinsic parameters in a unified framework. The proposed calibration method utilizes all observations in a single global optimization, and it significantly reduces the error caused by a naive composition of relative sensor poses. We extensively evaluate the proposed algorithm qualitatively and quantitatively on real and synthetic datasets. We plan to release the implementation to the public upon publication of the paper.
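To make the camera-to-LiDAR constraint concrete, the following is a minimal sketch of the point-to-plane residual described in the abstract: the checkerboard plane is fitted to the LiDAR points lying on the board, the 3D checkerboard corners estimated in a camera frame are transformed into the LiDAR frame with a candidate extrinsic, and their signed distances to the plane serve as residuals. This is an illustrative sketch under assumed conventions, not the authors' released implementation; all names (fit_plane, point_to_plane_residuals, R_cl, t_cl) are hypothetical.

import numpy as np

def fit_plane(points):
    # Fit a plane n.x + d = 0 to LiDAR points on the board (least squares via SVD).
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                    # direction of smallest variance = plane normal
    d = -normal.dot(centroid)
    return normal, d

def point_to_plane_residuals(R_cl, t_cl, corners_cam, normal, d):
    # Signed distances of camera-frame checkerboard corners to the LiDAR-frame plane.
    # R_cl (3x3) and t_cl (3,) map a point from the camera frame to the LiDAR frame.
    corners_lidar = corners_cam @ R_cl.T + t_cl
    return corners_lidar @ normal + d  # one residual per corner

if __name__ == "__main__":
    # Toy example: a board lying on the plane z = 1 in the LiDAR frame.
    rng = np.random.default_rng(0)
    lidar_pts = np.c_[rng.random((200, 2)), np.ones(200)]
    n, d = fit_plane(lidar_pts)
    # With an identity camera-to-LiDAR extrinsic, corners on the plane give near-zero residuals.
    corners = np.array([[0.1, 0.1, 1.0], [0.4, 0.2, 1.0], [0.2, 0.5, 1.0]])
    print(point_to_plane_residuals(np.eye(3), np.zeros(3), corners, n, d))

In a full calibration, such residuals would be stacked with the corner reprojection errors of all global and local frames and minimized jointly over the intrinsic and extrinsic parameters, e.g., with a nonlinear least-squares solver.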
Pages: 9033-9039
Number of pages: 7