Depth Estimation Using Single Camera with Dual Apertures
- Source :
- Smart Sensors at the IoT Frontier ISBN: 9783319553443
- Publication Year :
- 2017
- Publisher :
- Springer International Publishing, 2017.
-
Abstract
- Depth sensing is an active area of research in imaging technology. Here, we use a dual-aperture system to infer depth from a single image based on the principle of depth from defocus (DFD). The dual-aperture camera includes a small all-pass aperture (which admits all light) and a larger RGB-pass aperture (which admits visible light only). The IR image captured through the smaller aperture is therefore sharper than the RGB image captured through the larger aperture. Since the difference in blurriness between the two images depends on the actual distance, a dual-aperture camera provides an opportunity to estimate the depth of a scene. Measuring the absolute blur size is difficult, since it is affected by the illuminant's spectral distribution, noise, specular highlights, vignetting, etc. By using a dual-aperture camera, however, the relative blurriness can be measured in a robust way. In this article, a detailed description of extracting depth using a dual-aperture camera is provided, including procedures for fixing each of the artifacts that degrade depth quality in DFD. Experimental results confirm the improved depth extraction achieved by the aforementioned schemes.
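The relative-blur idea in the abstract can be sketched with a toy 1-D simulation: re-blur the sharper (IR) signal with candidate Gaussian kernels until it matches the blurrier (RGB) signal, and take the best-matching width as the relative blur, which is monotonic in defocus. This is an illustrative sketch only; the function names, the Gaussian blur model, and the brute-force search are assumptions, not the chapter's actual algorithm.

```python
import numpy as np

def gaussian_kernel(sigma):
    # 1-D Gaussian kernel, normalized to sum to 1
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def estimate_relative_blur(sharp, blurry, candidates):
    # Find the sigma that best maps the sharp (small-aperture) signal
    # onto the blurry (large-aperture) one; a larger relative sigma
    # corresponds to a scene point farther from the focal plane.
    errors = [np.mean((np.convolve(sharp, gaussian_kernel(s), mode="same")
                       - blurry) ** 2)
              for s in candidates]
    return candidates[int(np.argmin(errors))]

# Synthetic scene: a single step edge
x = np.linspace(-1, 1, 401)
scene = (x > 0).astype(float)

# Simulate the two apertures: mild defocus (IR) and strong defocus (RGB).
# Gaussians compose as G(a)*G(b) = G(sqrt(a^2 + b^2)), so the true
# relative blur here is sqrt(4^2 - 1^2) ~= 3.87.
sharp = np.convolve(scene, gaussian_kernel(1.0), mode="same")
blurry = np.convolve(scene, gaussian_kernel(4.0), mode="same")

candidates = np.arange(0.5, 8.0, 0.25)
sigma_rel = estimate_relative_blur(sharp, blurry, candidates)
```

Because the ratio of blurs (rather than either absolute blur) drives the estimate, per-pixel nuisances such as illumination and vignetting that affect both images similarly largely cancel, which is the robustness argument made in the abstract.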
- Subjects :
- Point spread function
Vignetting
Aperture
Computer science
Standard illuminant
Depth map
Computer vision and pattern recognition
Specular highlight
Computer vision
Artificial intelligence
Depth of field
Camera resectioning
Details
- ISBN :
- 978-3-319-55344-3
- ISBNs :
- 9783319553443
- Database :
- OpenAIRE
- Journal :
- Smart Sensors at the IoT Frontier ISBN: 9783319553443
- Accession number :
- edsair.doi...........46691901ab25fee469f43c2dcc3dbe32
- Full Text :
- https://doi.org/10.1007/978-3-319-55345-0_7