Imaging Components, Systems, and Processing

Extended depth-of-field in integral-imaging pickup process based on amplitude-modulated sensor arrays

Cheng-Gao Luo, Qiong-Hua Wang, Huan Deng, Yao Liu

Sichuan University, School of Electronics and Information Engineering, No. 24 South Section 1, Yihuan Road, Chengdu 610065, China

Opt. Eng. 54(7), 073108 (Jul 23, 2015). doi:10.1117/1.OE.54.7.073108
History: Received January 1, 2015; Accepted June 29, 2015

Open Access

Abstract.  We implement a depth-of-field (DOF) extending pickup experiment of integral imaging based on amplitude-modulated sensor arrays (SAs). By applying the amplitude-modulating technique to the SA in the optical pickup process, we can modulate the light intensity distribution in the imaging space: the central maximum of the Airy pattern becomes narrower and the DOF is enlarged. The experimental results obtained from the optical pickup process and the computational reconstruction process demonstrate the effectiveness of the DOF-extending method. We show that the DOF-extending pickup method is best suited to enhancing the DOF of three-dimensional scenes with small depth ranges.


Integral imaging is a three-dimensional (3-D) sensing and display technique that was first proposed by Lippmann in 1908.[1] Unlike stereoscopic 3-D display or holography,[2-6] integral imaging can provide full-parallax and continuous-viewing 3-D images without using any special glasses or coherent light.[7-11] An integral-imaging system uses a microlens array (MLA) to pick up and reconstruct lifelike true 3-D images. One of the main problems in integral imaging is its limited depth-of-field (DOF), and many useful methods have been proposed to improve it.[12-16] One approach is to reduce the numerical aperture of the microlenses; however, such a reduction also lowers the lateral resolution of the elemental images.[17]

It has been known for some time that obstructing the center of the aperture of an optical system (i.e., using an annular aperture) makes the central maximum of the Airy pattern narrower and increases the DOF.[18,19] Martínez-Corral et al.[20] proposed an amplitude-modulating method and presented a simulation experiment showing that the DOF of an integral-imaging pickup system is significantly enhanced by simply placing an opaque circular mask behind each microlens. To our knowledge, however, no optical integral-imaging pickup experiment has been reported to verify this method.

In this paper, we analyze the light intensity distribution in the amplitude-modulating pickup system and implement an optical pickup experiment using an amplitude-modulated sensor array (SA) to generate the DOF-enhanced elemental image array (EIA). The obtained EIA is then used for computational reconstruction to produce the 3-D images with extended DOF.

We assume that the integral-imaging system is linear and shift-invariant and is illuminated by a monochromatic light source with wavelength λ.

Figure 1 shows the schematic of the DOF-extending method in the integral-imaging pickup process. An opaque mask with diameter q is placed in front of each sensor to obstruct its central part. The 3-D object point O(x0, y0) is located away from the reference plane at a depth value z0 and produces a blurred image on the complementary metal-oxide-semiconductor (CMOS) sensor. The central sensor is denoted as the (0th, 0th) sensor. The pitch of the SA is p and the focal length of the SA is f. The distances l and g are related by the Gaussian lens law $1/l + 1/g - 1/f = 0$.
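The Gaussian lens law can be checked numerically; the minimal Python sketch below uses the distances that appear later in the optical pickup experiment:

```python
# Gaussian lens law: 1/l + 1/g - 1/f = 0, solved for the image distance g.
def image_distance(l_mm, f_mm):
    """Return the image distance g (mm) for object distance l and focal length f."""
    return 1.0 / (1.0 / f_mm - 1.0 / l_mm)

# Values from the optical pickup experiment: f = 50.0 mm, l = 289.6 mm.
print(round(image_distance(289.6, 50.0), 1))  # 60.4
```

This reproduces the 60.4-mm CMOS-to-principal-plane distance reported for the experimental setup.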

Fig. 1: Schematic of the depth-of-field (DOF) extending method in the integral-imaging pickup process.

The pupil function for the (0th, 0th) sensor in Fig. 1 is given by

$$P_{00}(x,y) = \mathrm{Circ}(x,y;p) - \mathrm{Circ}(x,y;q), \tag{1}$$

where $p > q \ge 0$, and the function

$$\mathrm{Circ}(x,y;p) = \begin{cases} 1, & \sqrt{x^2+y^2} \le p/2 \\ 0, & \text{otherwise} \end{cases} \tag{2}$$

represents the amplitude transmittance of a circular aperture.
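Equations (1) and (2) can be sampled directly on a grid. The sketch below (an illustration with an arbitrarily chosen grid size) builds the annular pupil of the (0th, 0th) sensor and checks that its open fraction approaches $(p^2 - q^2)/p^2$:

```python
import numpy as np

def annular_pupil(p, q, n=1024):
    """Sample the pupil P00(x, y) = Circ(x, y; p) - Circ(x, y; q) of Eq. (1)
    on an n x n grid spanning [-p/2, p/2] in both x and y."""
    coords = np.linspace(-p / 2, p / 2, n)
    x, y = np.meshgrid(coords, coords)
    r = np.hypot(x, y)
    return ((r <= p / 2) & (r > q / 2)).astype(float)

# Mask diameters from the paper: p = 8.8 mm, q = 6.2 mm.
pupil = annular_pupil(8.8, 6.2)
full = annular_pupil(8.8, 0.0)            # unobstructed circular aperture
open_fraction = pupil.sum() / full.sum()  # approaches (p**2 - q**2) / p**2
print(round(open_fraction, 3))            # ~0.504
```

The open fraction is exactly the light efficiency that Eq. (5) below expresses in percent.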

Accordingly, the pupil function for the (mth, nth) sensor can be expressed as $P_{mn}(x,y) = P_{00}(x - mp,\, y - np)$, where m and n are the indices of that sensor. For any monochromatic channel with wavelength λ, the phase transformation of the (mth, nth) sensor is written as

$$T_{mn}(x,y) = P_{00}(x - mp,\, y - np)\,\exp\!\left\{-\frac{jk}{2f}\left[(x - mp)^2 + (y - np)^2\right]\right\}, \tag{3}$$

where $k = 2\pi/\lambda$ is the wave number.

According to the paraxial approximation and Fresnel diffraction theory, the light intensity distribution on the CMOS plane $(x,y)$ can be obtained as

$$I(x,y;z_0) = \left|\frac{1}{\lambda^2 g (l+z_0)} \int_{-\infty}^{+\infty}\!\!\int_{-\infty}^{+\infty} \exp\!\left\{\frac{jk}{2(l+z_0)}\left[(x'-x_0)^2 + (y'-y_0)^2\right]\right\} T_{mn}(x',y')\, \exp\!\left\{\frac{jk}{2g}\left[(x-x')^2 + (y-y')^2\right]\right\} \mathrm{d}x'\,\mathrm{d}y'\right|^2, \tag{4}$$

where $(x', y')$ are the integration variables on the aperture plane. Here, the external pure phase factors have been dropped.
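Equation (4) can be evaluated numerically by a direct Riemann sum over the aperture plane. The sketch below is an illustrative implementation with arbitrarily chosen grid sizes, restricted to an on-axis object point and a 1-D cut across the CMOS; it reproduces the effect exploited in this paper, namely that the annular pupil yields a narrower central maximum than the full circular aperture:

```python
import numpy as np

# System parameters from the text (all lengths in mm).
P, Q, F, G, L, LAM = 8.8, 6.2, 50.0, 60.4, 289.6, 5.5e-4

def psf_cut(z0, annular=True, n_ap=200, n_img=81, span=0.01):
    """Riemann-sum evaluation of Eq. (4) for an on-axis object point at depth z0,
    along a 1-D cut y = 0 on the CMOS. Returns (x, I) with I normalised to its
    maximum; constant prefactors are dropped, as in the text."""
    k = 2 * np.pi / LAM
    a = np.linspace(-P / 2, P / 2, n_ap)
    xa, ya = np.meshgrid(a, a)
    r = np.hypot(xa, ya)
    inner = Q if annular else 0.0                   # opaque-mask diameter
    pupil = (r <= P / 2) & (r > inner / 2)          # Eq. (1)
    # Spherical wave from the object point times the lens phase of Eq. (3):
    field0 = pupil * np.exp(1j * k / 2 * (1 / (L + z0) - 1 / F) * r**2)
    x = np.linspace(-span, span, n_img)
    I = np.empty(n_img)
    for i, xv in enumerate(x):                      # propagate to each CMOS point
        prop = np.exp(1j * k / (2 * G) * ((xv - xa)**2 + ya**2))
        I[i] = abs((field0 * prop).sum())**2
    return x, I / I.max()

def lobe_width(x, I, level=2**-0.5):
    """Width of the region where the pattern stays above I_max / sqrt(2)."""
    idx = np.where(I >= level)[0]
    return x[idx[-1]] - x[idx[0]]

xs, I_ann = psf_cut(0.0, annular=True)
_, I_full = psf_cut(0.0, annular=False)
print(lobe_width(xs, I_ann) < lobe_width(xs, I_full))  # True: narrower central lobe
```

Sweeping z0 with the same routine gives the diameter-versus-depth curves discussed next.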

Note that the DOF-extending method works at the expense of light efficiency, which is given by

$$\eta = \frac{p^2 - q^2}{p^2} \times 100\%. \tag{5}$$
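For the mask sizes used in this work (p = 8.8 mm, q = 6.2 mm), Eq. (5) evaluates as:

```python
def light_efficiency(p_mm, q_mm):
    """Eq. (5): fraction of light transmitted by the annular aperture, in percent."""
    return (p_mm**2 - q_mm**2) / p_mm**2 * 100.0

print(round(light_efficiency(8.8, 6.2), 1))  # 50.4
```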

For the DOF calculation, we only take into account the rear DOF behind the reference plane. For an object point with a given depth value z0, the diffraction intensity pattern on the CMOS can be computed using Eq. (4). We define the pattern diameter as that of the circle on which the intensity has dropped by a factor of $1/\sqrt{2}$ from the maximum intensity of the pattern. By repeating this for object points at different depth values z0, we obtain a set of distinct diameters. With the system parameters p = 8.8 mm, q = 6.2 mm, f = 50.0 mm, g = 60.4 mm, and λ = 5.5×10⁻⁴ mm, we calculated the pattern diameters as a function of depth z0 for the DOF-extending method (red line) and the conventional method (green line), as shown in Fig. 2. From Fig. 2 we can see that the minimal pattern diameter on the CMOS is obtained when the object point lies on the reference plane (z0 = 0), and that the diameter gradually increases as the object point moves away from the reference plane (as z0 increases).

Fig. 2: DOF of the DOF-extending method and the conventional method.

The DOF of the integral-imaging pickup system can be defined as the distance over which the object may be axially shifted before an intolerable blur is produced.[21] The size of the critical tolerable pattern follows from combining the least distance of distinct vision of a normal eye (about 250.0 mm)[22] with the minimum angular resolution of the human eye (about 2.9×10⁻⁴ rad),[17] giving a tolerable pattern diameter of 72.5 μm (blue line in Fig. 2). The DOF of the DOF-extending and conventional methods can therefore be read off as the abscissas of the intersections of the blue line with the red and green lines, respectively, in Fig. 2. The results show that with an obscuration ratio $q/p \approx 1/\sqrt{2}$, the DOF of the DOF-extending method (about 60.0 mm) is almost twice that of the conventional method (about 30.0 mm). The light efficiency is 50.4% according to Eq. (5).
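The tolerable pattern diameter quoted above follows from a one-line calculation, using the values given in the text:

```python
# Least distance of distinct vision (mm) times the eye's minimum angular
# resolution (rad) gives the tolerable blur diameter.
tolerable_mm = 250.0 * 2.9e-4
print(round(tolerable_mm * 1e3, 1))   # 72.5 (micrometres)

# Obscuration ratio of the mask, close to 2**-0.5 ~ 0.707.
print(round(6.2 / 8.8, 3))            # 0.705
```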

To further verify the effectiveness of the DOF-extending pickup method based on an amplitude-modulated SA, we implemented an optical pickup experiment under white-light illumination and a computational reconstruction experiment. As shown in Fig. 3, a Canon EOS 60D sensor with a Canon EF-S 18-55 mm f/3.5-5.6 IS II lens was fixed onto a Lyseiki motorized translation stage to perform the optical pickup process. The stage was driven by its controller to move step by step in both the horizontal and vertical directions with a step length of 5.0 mm. The focal length, exposure time, ISO, F-number, and CMOS size were 50.0 mm, 1/25 s, 1000, F/5.6, and 22.3 mm × 14.9 mm, respectively. The sensor was focused on object 1, and the distance between the sensor CMOS and object 1 was 350.0 mm. According to the Gaussian lens law, the distance between the sensor CMOS and the equivalent principal plane of the sensor objective was 60.4 mm, and the distance between that principal plane and object 1 was 289.6 mm.

Fig. 3: Experimental setup of the optical integral-imaging pickup process.

Note that the opaque mask for amplitude modulation should be placed exactly on the aperture plane of the sensor objective, which was not accessible in our setup. We therefore introduced an additional diaphragm, with a diameter of 8.8 mm, onto the first surface of the objective to shift the aperture plane to that surface. As shown in Figs. 4(a) and 4(b), an 8.8-mm aperture stop and a 6.2-mm opaque mask were printed on photographic film for the optical integral-imaging pickup process. According to the DOF results in Fig. 2 and Sec. 3, the rear DOFs of the DOF-extending and conventional experimental setups are 60.0 mm and 30.0 mm, respectively.

Fig. 4: (a) The 8.8-mm aperture stop, (b) the aperture stop and the 6.2-mm opaque mask, and (c) the 3-D object used in the optical integral-imaging pickup process.

As shown in Fig. 4(c), we built a 3-D object consisting of three planar objects located at different depth positions. The distance between every two adjacent objects was 30.0 mm. To ensure that the pickup system had the same angular resolution at the three planar objects, the lateral sizes of object 1, object 2, and object 3 were set to 2.5 mm, 2.8 mm, and 3.0 mm, respectively. For each method, 7 × 7 images were captured by the SA as the original elemental images, each with a resolution of 5184 × 3456 pixels. Table 1 shows the parameters used in the experiment.

Table 1: Parameters used in the experiment.

Since the virtual MLA used in the computational reconstruction process had a focal length f = 22.0 mm and a pitch p = 5.0 mm, the obtained original elemental images needed to be shrunk to a resolution of 1000 × 1000 pixels. Figure 5 shows the obtained EIAs and the enlarged elemental images. Each EIA contains 7 × 7 elemental images and has a resolution of 7000 × 7000 pixels. The EIA and elemental images obtained with the DOF-extending method are much clearer than those of the conventional method, especially for the three high-contrast black fringes. However, the light efficiency is decreased because the central part of each sensor is obscured.
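The resize-and-tile step can be sketched as follows. This is a schematic illustration only: the arrays are random stand-ins for the captured photographs, all dimensions are scaled down by a factor of ten to keep it light (the real pipeline shrank 5184 × 3456 captures to 1000 × 1000 tiles in a 7000 × 7000 EIA), and nearest-neighbour resampling stands in for whatever filter was actually used:

```python
import numpy as np

def resize_nn(img, out_h, out_w):
    """Nearest-neighbour resize (a stand-in for the actual resampling filter)."""
    h, w = img.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[rows][:, cols]

def assemble_eia(elemental_images, n=7, size=100):
    """Shrink each elemental image to size x size pixels and tile the n x n grid
    into a single elemental image array (EIA)."""
    rows = [np.hstack([resize_nn(elemental_images[i * n + j], size, size)
                       for j in range(n)]) for i in range(n)]
    return np.vstack(rows)

captured = [np.random.rand(346, 518) for _ in range(49)]  # scaled-down stand-ins
eia = assemble_eia(captured)
print(eia.shape)  # (700, 700)
```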

Fig. 5: Elemental image arrays and enlarged elemental images obtained from (a) the DOF-extending and (b) the conventional optical integral-imaging pickup methods.

Also, to demonstrate the DOF-extending effect more intuitively, we took the two enlarged elemental images in Fig. 5 as an example and plotted the normalized intensity profiles of the three objects along the sampling path shown in Fig. 6(a). As shown in Fig. 6(b), the normalized intensity profile of object 1 obtained by the DOF-extending method [red lines in Fig. 6(b)] is quite similar to that obtained by the conventional method [blue lines in Fig. 6(b)], which means that object 1 is recorded clearly, with sharp edges and high contrast, by both methods. In Fig. 6(c), however, the intensity troughs of the two curves, which correspond to the color fringes on object 2, are clearly separated, while the intensity peaks, which correspond to the blank areas on object 2, remain quite close to each other. This is particularly apparent for the three intensity troughs in the right part of Fig. 6(c), which represent the three black fringes. These observations indicate that the image of object 2 obtained with the DOF-extending method has higher contrast and sharper edges than that obtained with the conventional method, and is thus more faithful to the shape of the original object. This is even more obvious in Fig. 6(d). We therefore conclude that the DOF is evidently enhanced by amplitude modulation.

Fig. 6: (a) Intensity sampling path (dashed arrow) on the object and normalized intensity profiles of (b) object 1, (c) object 2, and (d) object 3 for the two enlarged elemental images shown in Fig. 5.

After obtaining the EIAs, we conducted a computational reconstruction experiment; the reconstructed images obtained at different virtual imaging planes, located 12.4 mm, 49.4 mm, and 85.4 mm away from the original reference plane of the sensor, are shown in Fig. 7. These reconstructed images are slightly shifted with respect to their theoretical positions because of experimental errors introduced in the optical pickup process. From the results of the conventional method in Fig. 7(b), we can see that the reconstructed image of object 1 looks quite clear since it lies on the focusing plane of the sensor, object 2 starts to blur since it lies on the marginal depth plane, and object 3 is too blurred to be observed since it lies outside the depth range. By contrast, the image of object 3 in Fig. 7(a) is almost as clear as the image of object 2 in Fig. 7(b); the DOF-extending method has thus successfully moved the marginal depth plane from object 2 to object 3, which means that the DOF is increased from around 30.0 to 60.0 mm, as estimated in Fig. 2 and Sec. 3. Furthermore, as shown in Fig. 8, normalized intensity profiles of the reconstructed images were obtained using the same method as in Fig. 6. The intensity distributions of these reconstructed images are quite similar to those shown in Fig. 6, indicating that the reconstructed images obtained with the DOF-extending method are clearer than those of the conventional method, with sharper edges and higher contrast. Thus, the effectiveness of the DOF-extending method is confirmed. Note that in the experiment, the DOF is enhanced at the cost of a light-efficiency loss of about 49.6%. Care should therefore be taken when applying this DOF-extending method in situations where light efficiency is critical.
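Computational reconstruction itself can be sketched with a plain shift-and-sum scheme, a common approach to computational integral imaging that is not necessarily the exact algorithm used for Fig. 7. The tile size, pixel pitch, and lens-to-image gap below are illustrative assumptions, scaled to 100-pixel tiles of a reduced EIA, with the gap taken equal to the 22.0-mm virtual MLA focal length:

```python
import numpy as np

def reconstruct_plane(eia, n=7, tile=100, pitch_mm=5.0, gap_mm=22.0,
                      z_mm=50.0, pixel_mm=0.05):
    """Shift-and-sum reconstruction at one virtual plane a distance z_mm away.

    Each elemental image is shifted in proportion to its lens index by the
    disparity pitch * gap / z (converted to pixels); overlaps are averaged."""
    shift = pitch_mm * gap_mm / z_mm / pixel_mm      # disparity per lens index
    side = tile + int(round(shift * (n - 1)))
    out = np.zeros((side, side))
    count = np.zeros_like(out)
    for i in range(n):
        for j in range(n):
            ei = eia[i * tile:(i + 1) * tile, j * tile:(j + 1) * tile]
            r, c = int(round(i * shift)), int(round(j * shift))
            out[r:r + tile, c:c + tile] += ei
            count[r:r + tile, c:c + tile] += 1
    return out / np.maximum(count, 1)

# A flat test scene reconstructs to itself wherever the shifted tiles overlap.
demo_eia = np.ones((700, 700))
recon = reconstruct_plane(demo_eia, z_mm=50.0)
print(recon.shape)  # (364, 364)
```

Objects lying at a depth matching z_mm add up coherently across elemental images and appear sharp, while objects at other depths are smeared, which is the behavior visible in Fig. 7.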

Fig. 7: Computational reconstruction results obtained at different virtual imaging planes: (a) the DOF-extending method and (b) the conventional method.

Fig. 8: Normalized intensity profiles for the reconstructed images of (a) object 1, (b) object 2, and (c) object 3.

It is noteworthy that the DOF-extending method performs better in recording the high-frequency components of the object information owing to its bandpass characteristics. Generally, for a 3-D scene with a small depth range, viewers pay more attention to the details of the 3-D object, which are mostly resolved by the high-frequency information.[23] For a 3-D scene with a large depth range, on the other hand, the optical transfer function of the DOF-extending method suffers severe attenuation and oscillation, which seriously degrades the image quality.[24] Therefore, the DOF-extending pickup method is better suited to enhancing the DOF of a 3-D scene with a small depth range. After repeating the experiment with 3-D objects of different depth ranges, we found that 60.0 mm [shown in Fig. 4(c)] is almost the largest depth range over which the effectiveness of the DOF-extending method can be demonstrated with the pickup system shown in Fig. 3. The deeper the 3-D object, the less effective the DOF-extending method becomes.

We have analyzed the light intensity distributions and propagation characteristics of the DOF-extending and conventional integral-imaging pickup processes. Experimental results of the optical pickup process and the computational reconstruction process have shown that the DOF-extending method works effectively when recording a 3-D scene with a small depth range. Note that in the optical pickup experiment, the DOF is enhanced at the expense of light efficiency, so care should be taken when applying this DOF-extending method in situations where light efficiency is critical. Also, this method is currently difficult to apply to MLAs for optical pickup or reconstruction because of the limited aperture of each microlens, but in time it should become possible for manufacturers to fabricate amplitude-modulating masks directly onto MLAs. In our future work, DOF-extending methods for recording and displaying 3-D scenes with large depth ranges will be presented.

This work is supported by the “973” Program under Grant No. 2013CB328802, the NSFC under Grant Nos. 61225022 and 61320106015, and the “863” Program under Grant Nos. 2012AA011901 and 2012AA03A301. The authors would like to thank Prof. Manuel Martínez-Corral, Prof. Bahram Javidi, and Dr. Xiao Xiao for valuable suggestions on designing the optical pickup experiment.

1. Lippmann G., "La photographie intégrale," Comptes-Rendus Acad. Sci. 146, 446–451 (1908).
2. Son J. Y. et al., "Recent developments in 3-D imaging technologies," J. Disp. Technol. 6(10), 394–403 (2010).
3. Takaki Y., Tanaka K., and Nakamura J., "Super multi-view display with a lower resolution flat-panel display," Opt. Express 19(5), 4129–4139 (2011).
4. Xue G. L. et al., "Multiplexing encoding method for full-color dynamic 3D holographic display," Opt. Express 22(15), 18473–18482 (2013).
5. Li X. et al., "Video-rate holographic display using azo-dye-doped liquid crystal," J. Disp. Technol. 10(6), 438–443 (2014).
6. Huang Y. P., Hsieh P. Y., and Wu S. T., "Applications of multidirectional asymmetrical microlens-array light-control films on reflective liquid-crystal displays for image quality enhancement," Appl. Opt. 43(18), 3656–3663 (2004).
7. Xiao X. et al., "Advances in three-dimensional integral imaging: sensing, display, and applications," Appl. Opt. 52(4), 546–560 (2013).
8. Zhou L. Q. et al., "Voxel model for evaluation of a three-dimensional display and reconstruction in integral imaging," Opt. Lett. 39(7), 2032–2035 (2014).
9. Wu F. et al., "High-optical-efficiency integral imaging display based on gradient-aperture pinhole array," Opt. Eng. 52(5), 054002 (2013).
10. Zhang J. L. et al., "Feasibility study for pseudoscopic problem in integral imaging using negative refractive index materials," Opt. Express 22(17), 20757–20769 (2014).
11. Yim J., Kim Y. M., and Min S. W., "Real object pickup method for real and virtual modes of integral imaging," Opt. Eng. 53(7), 073109 (2014).
12. Lim Y. T. et al., "Analysis on enhanced depth of field for integral imaging microscope," Opt. Express 20(21), 23480–23488 (2012).
13. Kim S. K. et al., "Evaluation of the monocular depth cue in 3D displays," Opt. Express 16(26), 21415–21422 (2008).
14. Park J. H. et al., "Depth-enhanced three-dimensional–two-dimensional convertible display based on modified integral imaging," Opt. Lett. 29(23), 2734–2736 (2004).
15. Jang J. S. and Javidi B., "Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with nonuniform focal lengths and aperture sizes," Opt. Lett. 28(20), 1924–1926 (2003).
16. Park C. K., Lee S. S., and Hwang Y. S., "Depth-extended integral imaging system based on a birefringence lens array providing polarization switchable focal lengths," Opt. Express 17(21), 19047–19054 (2009).
17. Born M. and Wolf E., "Resolving power of image-forming systems," in Principles of Optics, 7th ed., pp. 461–465, Cambridge University Press, Cambridge, United Kingdom (1999).
18. Steward G. C., "Various forms of aperture," in The Symmetrical Optical System, Hall P. and Smithies F., Eds., pp. 88–102, Cambridge University Press, London, United Kingdom (1958).
19. Welford W. T., "Use of annular apertures to increase focal depth," J. Opt. Soc. Am. 50(8), 749–752 (1960).
20. Martínez-Corral M. et al., "Integral imaging with improved depth of field by use of amplitude-modulated microlens arrays," Appl. Opt. 43(31), 5806–5813 (2004).
21. Luo C. G. et al., "Analysis of the depth of field of integral imaging displays based on wave optics," Opt. Express 21(25), 31263–31273 (2013).
22. Born M. and Wolf E., "The eye," in Principles of Optics, 7th ed., pp. 261–263, Cambridge University Press, Cambridge, United Kingdom (1999).
23. Mino M. and Okano Y., "Improvement in the OTF of a defocused optical system through the use of shaded apertures," Appl. Opt. 10(10), 2219–2225 (1971).
24. O'Neill E. L., "Transfer function for an annular aperture," J. Opt. Soc. Am. 46(4), 285–288 (1956).

Cheng-Gao Luo is currently pursuing his PhD in optical engineering at Sichuan University, Chengdu, China. He worked as a visiting research scholar at the University of Connecticut from 2012 to 2013. His recent research interest is information display technologies including 3-D displays.

Qiong-Hua Wang is a professor of optics at the School of Electronics and Information Engineering, Sichuan University, China. She was a postdoctoral research fellow at the School of Optics/CREOL, University of Central Florida, from 2001 to 2004. She has published more than 200 papers on information displays. She is the associate editor of Optics Express and Journal of the Society for Information Display. Her recent research interests include optics and optoelectronics, especially display technologies.

Huan Deng is a lecturer of optics at the School of Electronic and Information Engineering, Sichuan University. She received her PhD from Sichuan University in 2012. She has published more than 10 papers. She is a member of Society for Information Display. Her recent research interest is information display technologies including 3-D displays.

Yao Liu is currently working toward her MS degree in optical engineering at the School of Electronics and Information Engineering, Sichuan University, Chengdu, China. Her current research interest is information display technologies including 3-D displays.

© The Authors. Published by SPIE under a Creative Commons Attribution 3.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.




