Real-time dense-view imaging for three-dimensional light-field display based on image color calibration and self-supervised view synthesis

Cited by: 17
Authors
Guo, Xiao [1]
Sang, Xinzhu [1]
Yan, Binbin [1]
Wang, Huachun [1]
Ye, Xiaoqian [1]
Chen, Shuo [1]
Wan, Huaming [1]
Li, Ningchi [1]
Zeng, Zhehao [1]
Chen, Duo [1]
Wang, Peng [1]
Xing, Shujun [1]
Affiliations
[1] Beijing Univ Posts & Telecommun, State Key Lab Informat Photon & Opt Commun, Beijing 100876, Peoples R China
Funding
National Natural Science Foundation of China
DOI
10.1364/OE.461789
CLC Number
O43 [Optics]
Discipline Code
070207; 0803
Abstract
Three-dimensional (3D) light-field display has seen promising improvement in recent years. However, because dense-view images cannot be captured quickly in real-world 3D scenes, real-time 3D light-field display of real scenes remains challenging, especially for high-resolution displays. Here, a real-time dense-view 3D light-field display method is proposed based on image color correction and self-supervised optical flow estimation, so that high image quality and a high frame rate are achieved simultaneously. In the proposed method, a sparse camera array first captures sparse-view images. To eliminate the color deviation among the sparse views, the imaging process of the camera is analyzed, and a practical multi-layer perceptron (MLP) network is proposed to perform color calibration. Given sparse views with consistent color, the optical flow is estimated at high speed by a lightweight convolutional neural network (CNN), which learns the flow from the input image pairs in a self-supervised manner. Finally, dense-view images are synthesized with an inverse warping operation. Quantitative and qualitative experiments are performed to evaluate the feasibility of the proposed method. Experimental results show that more than 60 dense-view images at a resolution of 1024 x 512 can be generated from 11 input views at a frame rate above 20 fps, which is four times faster than the previous optical-flow estimation methods PWC-Net and LiteFlowNet3. As a result, a large viewing angle and high-quality 3D light-field display at 3840 x 2160 resolution are achieved in real time. (C) 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement
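
The two learned stages summarized in the abstract (MLP-based color calibration followed by flow-based view synthesis via inverse warping) can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the ColorMLP name and layer sizes, the per-pixel calibration formulation, and the 0.5 flow scaling used to suggest an intermediate viewpoint are illustrative assumptions; only the backward-warping step with a normalized sampling grid follows the standard inverse-warp operation referred to in the abstract.

import torch
import torch.nn as nn
import torch.nn.functional as F


class ColorMLP(nn.Module):
    """Per-pixel color calibration: maps raw RGB toward a reference camera's color space."""

    def __init__(self, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        # img: (N, 3, H, W) -> flatten pixels, correct colors, restore shape
        n, c, h, w = img.shape
        pixels = img.permute(0, 2, 3, 1).reshape(-1, c)
        corrected = self.net(pixels)
        return corrected.reshape(n, h, w, c).permute(0, 3, 1, 2)


def inverse_warp(src: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Backward-warp a source view with a dense optical flow field given in pixels."""
    n, _, h, w = src.shape
    # Base sampling grid in pixel coordinates
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=src.dtype, device=src.device),
        torch.arange(w, dtype=src.dtype, device=src.device),
        indexing="ij",
    )
    grid = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(n, -1, -1, -1)
    # Shift by the flow, then normalize coordinates to [-1, 1] for grid_sample
    coords = grid + flow.permute(0, 2, 3, 1)
    coords_x = 2.0 * coords[..., 0] / (w - 1) - 1.0
    coords_y = 2.0 * coords[..., 1] / (h - 1) - 1.0
    norm_grid = torch.stack((coords_x, coords_y), dim=-1)
    return F.grid_sample(src, norm_grid, mode="bilinear",
                         padding_mode="border", align_corners=True)


if __name__ == "__main__":
    left = torch.rand(1, 3, 512, 1024)            # one captured sparse view
    flow = torch.zeros(1, 2, 512, 1024)           # flow from the target view to `left` (placeholder)
    calibrated = ColorMLP()(left)                 # untrained here; trained, it removes color deviation
    novel = inverse_warp(calibrated, 0.5 * flow)  # scaling the flow suggests an in-between viewpoint
    print(novel.shape)                            # torch.Size([1, 3, 512, 1024])

In the paper's pipeline the flow itself comes from a lightweight CNN trained self-supervised on the calibrated image pairs; the zero-flow placeholder above only stands in for that estimate so the sketch runs on its own.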
Pages: 22260-22276
Page count: 17