Dynamic (de)focused projection for three-dimensional reconstruction
Intuon Lertrusdachakul, Yohan D. Fougerolle, Olivier Laligant
Abstract
We present a novel 3-D recovery method based on structured light. This method unifies depth from focus (DFF) and depth from defocus (DFD) techniques through the use of a dynamic (de)focused projection. With this approach, the image acquisition system is specifically constructed to keep the whole object sharp in all captured images; only the projected patterns undergo different defocus deformations according to the object's depth. When the projected patterns are out of focus, their point-spread function (PSF) is assumed to follow a Gaussian distribution. The final depth is computed by analyzing the relationship between the sets of PSFs obtained from different blurs and the variation of the object's depth. Our depth estimation can be employed as a stand-alone strategy. It avoids occlusion and correspondence problems, and it handles textureless and partially reflective surfaces. Experimental results on real objects demonstrate the effectiveness of our approach, providing reliable depth estimation with competitive computation time. It uses fewer input images than DFF, and unlike DFD, it ensures that the PSF is locally unique.
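To illustrate the core idea behind this family of methods, the sketch below estimates the sigma of a Gaussian PSF from a blurred step-edge profile (the derivative of a Gaussian-blurred step is itself a Gaussian, so its standard deviation recovers the PSF sigma) and then maps sigma to depth through a simple linear calibration model. The function names, the step-edge input, and the linear blur-to-depth model are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def estimate_psf_sigma(edge_profile):
    """Estimate the Gaussian PSF sigma from a blurred 1-D step edge.

    The derivative of a Gaussian-blurred step is a Gaussian whose
    standard deviation equals the PSF sigma, so we compute the
    weighted standard deviation of the gradient magnitude.
    """
    d = np.abs(np.gradient(edge_profile.astype(float)))
    x = np.arange(len(d))
    w = d / d.sum()                      # normalize gradient into weights
    mu = (w * x).sum()                   # centroid of the edge
    return np.sqrt((w * (x - mu) ** 2).sum())

def depth_from_sigma(sigma, a, b):
    """Map blur sigma to depth with an assumed linear calibration
    model depth = a * sigma + b (coefficients obtained offline)."""
    return a * sigma + b

# Simulated example: a step edge blurred by a known sigma = 3 PSF.
profile = np.zeros(200)
profile[100:] = 1.0
blurred = gaussian_filter1d(profile, sigma=3.0)
sigma_hat = estimate_psf_sigma(blurred)
```

In practice the calibration coefficients would be obtained by imaging the projected pattern at several known depths and fitting the sigma-versus-depth relation, in the spirit of the PSF/depth analysis described in the abstract.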
©(2011) Society of Photo-Optical Instrumentation Engineers (SPIE)
Intuon Lertrusdachakul, Yohan D. Fougerolle, and Olivier Laligant "Dynamic (de)focused projection for three-dimensional reconstruction," Optical Engineering 50(11), 113201 (1 November 2011). https://doi.org/10.1117/1.3644541
Published: 1 November 2011
CITATIONS
Cited by 11 scholarly publications.
RIGHTS & PERMISSIONS
Get copyright permission on Copyright Marketplace
KEYWORDS
3D modeling
Calibration
Point spread functions
Projection systems
Sensors
Cameras
Optical engineering
