3D depth sensing of active structured light field based on EPI

Cited: 0
Authors
Lin YingSi [1 ]
Wang Meng [1 ]
Wang Ziwei [1 ]
Tang Qijian [1 ]
Liu Xiaoli [1 ]
Affiliations
[1] Shenzhen Univ, Minist Educ & Guangdong Prov, Key Lab Optoelect Devices & Syst, Coll Phys & Optoelect Engn, Shenzhen 518060, Peoples R China
Source
OPTICAL SENSORS 2021 | 2021 / Vol. 11772
Funding
National Natural Science Foundation of China;
Keywords
light field; epipolar plane image; fringe projection; depth estimation; calibration; DEFOCUS;
DOI
10.1117/12.2592526
CLC Classification
TM [Electrical Engineering]; TN [Electronic & Communication Technology];
Discipline Codes
0808; 0809;
Abstract
In this paper, a three-dimensional (3D) depth sensing system based on active structured light field (ALF) imaging is proposed. In light field imaging, one of the most commonly used methods for depth estimation is based on the Epipolar Plane Image (EPI), in which the slope of line features is related to parallax and is inversely proportional to the depth of the measured object. However, it is difficult to extract these line features accurately from the captured texture of the object alone, especially in the presence of weak texture, repeated texture, or noise. Therefore, an active phase feature provided by phase-shifting fringe projection is introduced into the system, with which the line features in the EPI can be extracted by simply searching for correspondence points with the same phase value. To obtain a depth map with metric accuracy, a metric calibration method is proposed to establish the quantitative relationship between the slope of the lines and the depth. In addition, because of distortions in the light field camera (LFC), the correspondence points in the EPI do not fit a linear distribution well enough; a second calibration based on the LFC imaging model and Bundle Adjustment (BA) was therefore implemented to correct distortions in the EPI, which reduces the fitting errors of the line features. Experimental results show that the calibration methods described above are effective and that the built ALF sensing system works well for 3D depth estimation.
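The slope-to-depth idea in the abstract can be sketched numerically. The toy below is an illustration under idealized assumptions, not the paper's implementation: it assumes an absolute (unwrapped) fringe phase that is monotonic across each EPI row, no lens distortion, and a made-up calibration constant `k` standing in for the metric calibration the paper performs. Same-phase correspondence points are located by interpolation in each view, a line is fitted through them, and depth is taken as inversely proportional to its slope.

```python
import numpy as np

def epi_depth_from_phase(phase_rows, view_positions, u_ref, k=100.0):
    """Estimate the depth of the point seen at pixel u_ref in the first view.

    phase_rows: (S, U) array, unwrapped fringe phase of one EPI (one row per view).
    view_positions: (S,) view coordinates s along the EPI's angular axis.
    k: assumed metric-calibration constant mapping 1/slope to depth (hypothetical).
    """
    u_axis = np.arange(phase_rows.shape[1], dtype=float)
    # phase value of the reference point in view 0
    target = np.interp(u_ref, u_axis, phase_rows[0])
    # sub-pixel position with the same phase in every view
    us = [np.interp(target, row, u_axis) for row in phase_rows]
    # least-squares line fit u(s); the slope du/ds encodes disparity
    slope = np.polyfit(view_positions, us, 1)[0]
    return k / slope  # depth inversely proportional to EPI line slope

# Synthetic EPI: 5 views, fringe period 16 px, point shifting 4 px per view,
# so with k = 100 the expected depth is 100 / 4 = 25.
s = np.arange(5, dtype=float)
true_slope = 4.0
rows = np.stack([2 * np.pi * (np.arange(64) - true_slope * si) / 16.0 for si in s])
depth = epi_depth_from_phase(rows, s, u_ref=30.0)
print(round(depth, 2))  # -> 25.0
```

In practice the phase must first be unwrapped, and the paper additionally corrects LFC distortions (via its imaging model and Bundle Adjustment) before the line fit, precisely because real correspondence points deviate from the straight line this sketch assumes.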
Pages: 7