Metrological analysis of the three-dimensional reconstruction based on close-range photogrammetry and the fusion of long-wave infrared and visible-light images

Cited: 0
Authors
de Oliveira, Bernardo C. F. [1 ]
Marcellino, Guilherme C. [1 ]
da Rosa, Pablo A. [1 ]
Pinto, Tiago L. F. C. [1 ]
Affiliations
[1] Univ Fed Santa Catarina, Campus UFSC, POB 5053, BR-88040970 Florianopolis, SC, Brazil
Keywords
photogrammetry; three-dimensional reconstruction; image fusion; long-wave infrared thermography
DOI
10.1088/1361-6501/abb273
Chinese Library Classification
T [Industrial Technology]
Discipline Code
08
Abstract
This work statistically evaluates the metrological performance of three-dimensional reconstructions built from fused long-wavelength infrared (LWIR) and visible-light (VL) images. The image fusion procedure is based on the two-dimensional wavelet transform and two pixel-level fusion rules: the maximum intensity level, presented in a previous work by the authors, and a new rule that replaces the VL information with the LWIR information in the image region corresponding to the measured object. Reconstructions of a translucent cube were performed with a point-triangulation-based procedure, and measurements of the cube's dimensions were used as the evaluation criteria. The results show that the fused images have more contrast but also more artifacts. The fusion procedures generated denser reconstructions, with at least 34.83% more points. Regarding the metrological results, reconstructions from visible-light images alone showed up to 89.31% less measurement bias but at least 47.25% more uncertainty than those from fused images. The new fusion rule provided the best results, with more points in the dense cloud and lower uncertainty. This work is important because it provides a metrologically viable alternative for the three-dimensional reconstruction of objects with low contrast or poor texture information in the visible spectrum, and to which no targets can be applied.
Pages: 13
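The pixel-level wavelet fusion summarized in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes already registered, same-size grayscale VL and LWIR images, uses the PyWavelets library, and interprets the maximum-intensity rule as a per-coefficient maximum-magnitude selection in the wavelet domain; the function name fuse_max_rule, the db2 wavelet, and the decomposition level are illustrative choices only.

import numpy as np
import pywt  # PyWavelets


def fuse_max_rule(vl, lwir, wavelet="db2", level=3):
    """Fuse two registered, same-size grayscale images in the wavelet
    domain, keeping at each position the coefficient of larger magnitude.
    (Illustrative sketch; the paper's exact rule may differ.)"""
    c_vl = pywt.wavedec2(vl.astype(float), wavelet, level=level)
    c_ir = pywt.wavedec2(lwir.astype(float), wavelet, level=level)
    fused = []
    for a, b in zip(c_vl, c_ir):
        if isinstance(a, np.ndarray):
            # First element: approximation coefficients of the coarsest level.
            fused.append(np.where(np.abs(a) >= np.abs(b), a, b))
        else:
            # Remaining elements: (horizontal, vertical, diagonal) detail tuples.
            fused.append(tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                               for x, y in zip(a, b)))
    # Inverse transform returns the fused image in the spatial domain.
    return pywt.waverec2(fused, wavelet)

The fused image produced this way would then feed the photogrammetric pipeline in place of (or alongside) the original VL images; the second rule described in the abstract, which substitutes LWIR content only inside the object region, would instead require a segmentation mask of the measured object.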