SD-SEM: sparse-dense correspondence for 3D reconstruction of microscopic samples

Cited by: 7
Authors
Baghaie, Ahmadreza [1]
Tafti, Ahmad P. [2]
Owen, Heather A. [3]
D'Souza, Roshan M. [4]
Yu, Zeyun [1,5]
Affiliations
[1] Univ Wisconsin, Dept Elect Engn, Madison, WI 53706 USA
[2] Marshfield Clin Res Fdn, Biomed Informat Res Ctr, Marshfield, WI USA
[3] Univ Wisconsin, Dept Biol Sci, Madison, WI 53706 USA
[4] Univ Wisconsin, Dept Mech Engn, Madison, WI 53706 USA
[5] Univ Wisconsin, Dept Comp Sci, Madison, WI 53706 USA
Keywords
Scanning electron microscope (SEM); 3D reconstruction; Feature descriptors; Dense correspondence; Optical-flow estimation; Surface reconstruction; Stereo; Features
DOI
10.1016/j.micron.2017.03.009
CLC classification number
TH742 [Microscopes]
Abstract
Scanning electron microscopy (SEM) imaging has been a principal component of many studies in the biomedical, mechanical, and materials sciences since its emergence. Despite the high resolution of the captured images, they remain two-dimensional (2D). In this work, a novel framework using sparse-dense correspondence is introduced and investigated for 3D reconstruction from stereo SEM images. SEM micrographs of microscopic samples are captured by tilting the specimen stage by a known angle. The pair of SEM micrographs is then rectified using sparse scale-invariant feature transform (SIFT) features/descriptors and a contrario RANSAC for removing outlier matches, so that corresponding points are displaced predominantly horizontally. Dense correspondence is then estimated using dense SIFT descriptors, with the energy-minimization functional represented as a factor graph and optimized by loopy belief propagation (LBP). Given the pixel-by-pixel correspondence and the tilt angle of the specimen stage during acquisition, depth can be recovered. Extensive tests demonstrate the strength of the proposed method for high-quality reconstruction of microscopic samples. (C) 2017 Elsevier Ltd. All rights reserved.
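The workflow described in the abstract (sparse SIFT matching, outlier rejection, rectification, dense correspondence, and depth from the known tilt angle) can be roughly approximated with off-the-shelf tools. The sketch below uses OpenCV and is not the authors' implementation: OpenCV's standard RANSAC stands in for the a contrario RANSAC named in the abstract, semi-global block matching stands in for the dense-SIFT/factor-graph/LBP stage, and the function name, the tilt_deg parameter, and the final parallax-to-height relation (a common eucentric-tilt approximation) are assumptions rather than details taken from the paper.

import cv2
import numpy as np


def sem_stereo_height_sketch(img_left, img_right, tilt_deg=5.0):
    # Sparse stage: SIFT keypoints/descriptors on both micrographs.
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img_left, None)
    kp2, des2 = sift.detectAndCompute(img_right, None)

    # Ratio-test matching of the sparse descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in knn if m.distance < 0.75 * n.distance]

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # Outlier removal via epipolar geometry; plain RANSAC is used here
    # in place of the a contrario RANSAC used in the paper.
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
    inliers1 = pts1[mask.ravel() == 1]
    inliers2 = pts2[mask.ravel() == 1]

    # Uncalibrated rectification so that residual displacement between
    # corresponding points is mostly horizontal.
    h, w = img_left.shape[:2]
    _, H1, H2 = cv2.stereoRectifyUncalibrated(inliers1, inliers2, F, (w, h))
    rect1 = cv2.warpPerspective(img_left, H1, (w, h))
    rect2 = cv2.warpPerspective(img_right, H2, (w, h))

    # Dense stage: semi-global block matching stands in for the paper's
    # dense SIFT descriptors with factor-graph/LBP optimization.
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
    disparity = sgbm.compute(rect1, rect2).astype(np.float32) / 16.0

    # Depth stage: convert pixel parallax to relative height with a common
    # eucentric-tilt approximation, height = parallax / (2 * sin(theta / 2));
    # this is an assumption, not the exact relation derived in the paper.
    theta = np.deg2rad(tilt_deg)
    height_map = disparity / (2.0 * np.sin(theta / 2.0))
    return height_map

Grayscale input micrographs of equal size are assumed. Because block matching replaces the factor-graph/LBP optimization, the resulting height map will generally be noisier than the published results; the sketch is only meant to make the stages of the pipeline concrete.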
Pages: 41-55
Page count: 15