Distance error correction for time-of-flight cameras

Cited: 2
Authors
Fuersattel, Peter [1 ,2 ]
Schaller, Christian [2 ]
Maier, Andreas [1 ]
Riess, Christian [1 ]
Affiliations
[1] Friedrich Alexander Univ Erlangen Nuremberg, Pattern Recognit Lab, Martensstr 3, Erlangen, Germany
[2] Metrilus GmbH, Henkestr 91, Erlangen, Germany
Keywords
Time-of-Flight; Systematic Errors; Random Forests; Calibration
DOI
10.1117/12.2271775
Chinese Library Classification
TP18 (Theory of Artificial Intelligence)
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The measurement accuracy of time-of-flight cameras is limited by scene properties and systematic errors. These errors can accumulate to several centimeters, which may limit the applicability of these range sensors. In the past, different approaches have been proposed for improving the accuracy of these cameras. In this work, we propose a new method that improves two important aspects of range calibration. First, we propose a new checkerboard that is augmented with a gray-level gradient. With this addition, the calibration features for intrinsic and distance calibration can be captured at the same time. The gradient strip allows acquiring a large number of distance measurements for different surface reflectivities, which results in more meaningful training data. Second, we present several new features that are used as input to a random forest regressor. By using random regression forests, we circumvent the problem of finding an accurate model for the measurement error. During application, a correction value for each individual pixel is estimated with the trained forest based on a specifically tailored feature vector. With our approach, the measurement error can be reduced by more than 40% for the Mesa SR4000 and by more than 30% for the Microsoft Kinect V2. In our evaluation, we also investigate the impact of the individual forest parameters and illustrate the importance of the individual features.
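
The record contains no code; the sketch below is only a minimal, hypothetical illustration of the random-forest correction step described in the abstract. It assumes scikit-learn's RandomForestRegressor and an invented per-pixel feature vector (measured distance, amplitude as a reflectivity proxy, pixel coordinates) in place of the paper's specifically tailored features; the synthetic error model and ground truth stand in for the checkerboard-based training data.

# Minimal sketch (not the authors' implementation): per-pixel distance error
# correction with a random forest regressor. Features and error model are
# hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def build_features(distance, amplitude, xs, ys):
    """Stack per-pixel measurements into an (N, 4) feature matrix."""
    return np.column_stack([distance.ravel(), amplitude.ravel(),
                            xs.ravel(), ys.ravel()])

# --- training on (synthetic) calibration captures ---------------------------
rng = np.random.default_rng(0)
h, w = 120, 160                                   # toy sensor resolution
ys, xs = np.mgrid[0:h, 0:w]
distance = rng.uniform(0.5, 5.0, size=(h, w))     # measured distances [m]
amplitude = rng.uniform(0.1, 1.0, size=(h, w))    # reflectivity proxy
# Invented distance-dependent, reflectivity-dependent error (stand-in for
# ground truth obtained from the calibration target).
true_distance = distance - 0.02 * np.sin(4 * distance) * (1.2 - amplitude)

X = build_features(distance, amplitude, xs, ys)
y = (true_distance - distance).ravel()            # per-pixel correction value

forest = RandomForestRegressor(n_estimators=50, max_depth=12, n_jobs=-1)
forest.fit(X, y)

# --- application: estimate a correction for each pixel of a depth frame -----
correction = forest.predict(X).reshape(h, w)
corrected = distance + correction

The regression-forest formulation avoids committing to an explicit parametric error model: any systematic, feature-dependent bias that appears in the training captures can in principle be learned and subtracted per pixel.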
Pages: 10