Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera

Times Cited: 3
Authors
Rahkonen, Samuli [1 ]
Lind, Leevi [1 ]
Raita-Hakola, Anna-Maria [1 ]
Kiiskinen, Sampsa [1 ]
Polonen, Ilkka [1 ]
Affiliations
[1] Univ Jyvaskyla, Fac Informat Technol, Jyvaskyla 40014, Finland
Keywords
hyperspectral; depth data; kinect; sensor fusion; reflectance;
DOI
10.3390/s22228668
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Discipline Classification Codes
070302; 081704;
Abstract
Hyperspectral imaging and distance data have previously been used in aerial, forestry, agricultural, and medical imaging applications. Extracting meaningful information from a combination of different imaging modalities is difficult: image sensor fusion requires knowing the optical properties of the sensors, selecting the right optics, and finding the sensors' mutual reference frame through calibration. In this research, we demonstrate a method for fusing data from a Fabry-Perot interferometer hyperspectral camera and a Kinect V2 time-of-flight depth-sensing camera. We created an experimental application that uses the depth-augmented hyperspectral data to measure emission-angle-dependent reflectance from a point cloud inferred from multiple views. We determined the intrinsic and extrinsic camera parameters through calibration, used global and local registration algorithms to combine point clouds from different viewpoints, created a dense point cloud, and determined the angle-dependent reflectances from it. The method successfully combined the 3D point cloud data and the hyperspectral data from different viewpoints of a reference colorchecker board. The point cloud registrations reached a fitness of 0.29-0.36 for inlier point correspondences, with an RMSE of approximately 2, which indicates a fairly reliable registration result. The RMSE of the measured reflectances between the front view and the side views of the targets varied between 0.01 and 0.05 on average, and the spectral angle varied between 1.5 and 3.2 degrees. The results suggest that a changing emission angle has a very small effect on the surface reflectance intensity and spectrum shape, as expected for the colorchecker used.
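The abstract compares reflectance spectra between viewpoints using two metrics: RMSE of the reflectance values and the spectral angle between spectra. A minimal sketch of those two metrics in numpy follows; the function names are illustrative, not taken from the paper's code. Note that the spectral angle is insensitive to a uniform intensity scaling (it measures spectrum shape), while the RMSE is not, which is why the two metrics together separate intensity changes from shape changes.

```python
import numpy as np

def spectral_angle_deg(a, b):
    """Spectral angle (degrees) between two reflectance spectra.

    Treats each spectrum as a vector and returns the angle between
    them; identical shapes (up to scaling) give an angle of zero.
    """
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    # Clip to guard against floating-point values slightly outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def reflectance_rmse(a, b):
    """Root-mean-square error between two reflectance spectra."""
    return float(np.sqrt(np.mean((np.asarray(a) - np.asarray(b)) ** 2)))

# Illustrative spectra: a side view that is a dimmed copy of the front view
# changes the RMSE but leaves the spectral angle at zero.
front = np.array([0.20, 0.40, 0.60, 0.55])
side = 0.9 * front
```

For example, `spectral_angle_deg(front, side)` is zero while `reflectance_rmse(front, side)` is positive, mirroring how the paper distinguishes changes in reflectance intensity from changes in spectrum shape.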
Pages: 18