A novel single-pixel fluorescence microscope with super-resolution, based on structured illumination microscopy, has been developed to unlock the full potential of single-pixel microscopy. By employing enhanced spatially resolved patterns, the proposed microscope measures light intensities with a bucket detector and improves the lateral resolution of both brightfield and fluorescence imaging by extending the spatial frequency spectrum. The results demonstrate significant improvements in lateral resolution, highlighting its applicability to fluorescence microscopy.
We conducted a comprehensive characterization of key phenomena in digital micromirror devices (DMDs) using an ultrashort pulse beam of 98.8 fs at 800 nm. Firstly, we determined the fluence threshold, which was found to be 0.19 J/cm^2. Secondly, we quantified the nonlinear dispersion introduced by the DMD in the pulse beam using the SPIDER method. Our measurements revealed a second-order group delay dispersion (GDD) of 473 fs^2, a third-order dispersion (TOD) of 3700 fs^3, and a fourth-order dispersion (FOD) of -2027000 fs^4. Our research significantly advances our understanding of DMD behavior and its interaction with ultrashort pulses, thereby helping to optimize their use in optical applications.
A method to increase resolution in single-pixel imaging through a parallel implementation based on the self-imaging effect is proposed. Walsh-Hadamard patterns, used as the scanning basis, are displayed in each unit cell of a 2D binary grating codified on a DMD. The self-imaging effect is used to project the sampling functions onto the object. Images with higher resolution can be obtained by using a light sensor with a low number of pixels and reconstructing with single-pixel techniques. This approach can be useful to improve the resolution of IR or THz cameras. Preliminary results are shown.
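As a generic illustration of the single-pixel reconstruction step these abstracts rely on, the sketch below simulates Walsh-Hadamard sampling and recovery from bucket-detector readings. The object, pattern size and noise-free measurements are illustrative assumptions, not data from the papers.

```python
# Minimal single-pixel imaging sketch: sample an N x N object with 2D
# Walsh-Hadamard patterns and recover it from the bucket-detector signal.
import numpy as np
from scipy.linalg import hadamard

N = 32                               # image side (power of two)
H = hadamard(N * N)                  # Walsh-Hadamard basis; each row is one pattern

# Hypothetical ground-truth object, used only to simulate detector readings.
obj = np.zeros((N, N))
obj[8:24, 8:24] = 1.0

# Each measurement is the inner product of one pattern with the object,
# i.e. what the bucket detector integrates. Patterns take values {+1, -1};
# on a DMD they are split into complementary {0, 1} masks.
y = H @ obj.ravel()

# Reconstruction: scaled transpose of the orthogonal Hadamard basis.
recovered = (H.T @ y / (N * N)).reshape(N, N)
print(np.allclose(recovered, obj))   # True
```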
We propose a new single-pixel microscope with optical sectioning capabilities based on structured illumination techniques. The method uses single-pixel imaging (SPI) techniques, interrogating the sample with a series of spatially resolved patterns and measuring the output intensities with a detector without spatial resolution. Moreover, optical sectioning is obtained by adding a grating to the system and employing Structured Illumination Microscopy (SIM). The system allows us to perform 3D bright-field and fluorescence microscopy. We apply compressive sensing techniques to decrease the acquisition time.
Imaging through turbid media remains a relevant topic in biomedical imaging. In this contribution, we propose the combination of spatial frequency domain imaging (SFDI) and single-pixel imaging (SPI) to image objects hidden by a turbid medium. Firstly, the SFDI method allows us to characterize the turbid medium by projecting sinusoidal intensity patterns. Secondly, the SPI technique provides images of the object through the areas of the turbid medium with higher transmission of ballistic photons. The key elements of the system are a DMD to generate the sampling patterns and an LED array working as a programmable light source. Experimental results supporting this idea are shown.
We present a single-pixel spatial frequency domain imaging (SP-SFDI) system in which a single DMD (digital micromirror device) simultaneously modulates the sinusoidal pattern for the spatial frequency sampling and the spatial sampling patterns used to achieve spatial resolution. The detection system is therefore simplified to the point where it is replaced by an integrating sphere (IS) with a photodiode as a bucket detector. The characterization capabilities of this system are verified by imaging the absorption and reduced scattering coefficients of an inhomogeneous turbid medium slab and of several moles on the forearm.
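For reference, the generic three-phase SFDI demodulation step is sketched below; it is the standard processing used in spatial frequency domain imaging and not necessarily the exact pipeline of these papers. Function and variable names are illustrative.

```python
# Standard SFDI three-phase demodulation: from three sinusoidal-illumination
# images with phase shifts 0, 2*pi/3 and 4*pi/3, recover the DC and AC
# modulation amplitudes that feed the lookup-table inversion for (mu_a, mu_s').
import numpy as np

def demodulate(i1, i2, i3):
    """i1, i2, i3: reflectance images (2D arrays) at the three phase shifts."""
    m_dc = (i1 + i2 + i3) / 3.0
    m_ac = (np.sqrt(2.0) / 3.0) * np.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2
    )
    return m_dc, m_ac

# Calibration against a phantom of known optical properties converts the
# modulation amplitudes into diffuse reflectance versus spatial frequency;
# (mu_a, mu_s') then follow from a diffusion-model or Monte Carlo lookup table.
```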
We present a spatial frequency domain imaging (SFDI) system based on single-pixel imaging (SPI) techniques with a single digital micromirror device (DMD) modulating simultaneously the sinusoidal pattern and the spatial sampling masks.
In this contribution, we present an optical imaging system with structured illumination and integrated detection, based on the Kubelka-Munk light propagation model, for the spatial characterization of the scattering and absorption properties of turbid media. The proposed system is based on the application of single-pixel imaging techniques to achieve spatial resolution. Our strategy allows us to retrieve images of the absorption and scattering properties of a turbid medium slab by using integrating spheres with photodiodes as bucket detectors. We validate our idea by imaging the absorption and scattering coefficients of a spatially heterogeneous phantom and an organic sample.
We present a novel phase imaging system based on a non-interferometric approach to obtain the complex amplitude of a wavefront. We sample the wavefront with a high-speed spatial light modulator. Then, a single lens forms a time-dependent light distribution on its focal plane, where a position detector is placed. This simple setup, with the aid of computational techniques, provides high spatial resolution, power efficiency and high dynamic range, and allows us to work in spectral regions outside the VIS range. The validity of the technique is demonstrated by measuring both optical aberrations and phase distributions of transparent samples.
We present a diffuse optical imaging system with structured illumination and integrated detection for the spatial characterization of scattering and absorption properties of turbid media. It is based on the application of single-pixel imaging techniques with integrating spheres, which allows us to develop a spatially resolved version of the Kubelka-Munk method.
We present a novel approach for imaging through turbid media that combines the principles of Fourier spatial filtering with single-pixel imaging. We compare the performance of our single-pixel imaging setup with that of a conventional system. We conclude that the introduction of Fourier gating improves the contrast of the images in both cases. Furthermore, we show that single-pixel imaging is better suited than conventional imaging to vision through turbid media by Fourier filtering.
We present a phase imaging system using a novel non-interferometric approach. We overcome the limitations in spatial resolution, optical efficiency, and dynamic range that are found in Shack-Hartmann sensors. To do so, we sample the wavefront using a digital micromirror device. A single lens forms a time-dependent light distribution on its focal plane, where a position detector is placed. Our approach is lenslet-free and does not use any kind of iterative or phase-unwrapping algorithm to recover the phase information. The validity of our approach is demonstrated by performing both aberration sensing and phase imaging of transparent samples.
We present a novel imaging system that combines the principles of Fourier spatial filtering and single-pixel imaging in order to recover images of an object hidden behind a turbid medium by transillumination. We compare the performance of our single-pixel imaging setup with that of a conventional system. We conclude that the introduction of Fourier gating improves the contrast of images in both cases. Furthermore, we show that the combination of single-pixel imaging and Fourier spatial filtering techniques is particularly well adapted to provide images of objects transmitted through scattering media.
We perform phase imaging using a non-interferometric approach to measure the complex amplitude of a wavefront. We overcome the limitations in spatial resolution, optical efficiency, and dynamic range that are found in Shack-Hartmann wavefront sensing. To do so, we sample the wavefront with a high-speed spatial light modulator. A single lens forms a time-dependent light distribution on its focal plane, where a position detector is placed. Our approach is lenslet-free and does not rely on any kind of iterative or phase-unwrapping algorithm. The validity of our technique is demonstrated by performing both aberration sensing and phase imaging of transparent samples.
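The abstracts above describe the sensor only at a high level, so the sketch below illustrates one plausible slope-to-phase step under stated assumptions: each sampled zone shifts the focal spot on the position detector by the focal length times the local slope, and a simple zonal integration turns the slope maps into a phase map. It is not the authors' exact algorithm, and all names are illustrative.

```python
# Assumed zonal reconstruction sketch: local wavefront slopes (dW/dx, dW/dy) are
# obtained from the centroid displacement on the position detector
# (slope = displacement / focal_length), then integrated into a phase map.
import numpy as np

def phase_from_slopes(sx, sy, pitch, wavelength):
    """sx, sy: dimensionless slope maps; pitch: sampling period [m]."""
    k = 2 * np.pi / wavelength
    phi_x = np.cumsum(sx, axis=1) * pitch * k   # integrate along x
    phi_y = np.cumsum(sy, axis=0) * pitch * k   # integrate along y
    return 0.5 * (phi_x + phi_y)                # crude average of both estimates

# Example with a synthetic tilt of 1e-5 rad over a 32 x 32 sampling grid.
sx = np.full((32, 32), 1e-5)
sy = np.zeros((32, 32))
phi = phase_from_slopes(sx, sy, pitch=100e-6, wavelength=633e-9)
```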
Over the past decade, single-pixel imaging (SPI) has established itself as a viable tool in scenarios where traditional imaging techniques struggle to provide images of acceptable quality in practical times and at reasonable cost. However, SPI still has several limitations inherent to the technique, such as sensitivity to spurious light and difficulty operating in real time. Here we present a novel approach using complementary measurements and a single balanced detector. By using balanced detection, we improve the frame rate of complementary measurement architectures by a factor of two. Furthermore, the use of a balanced detector makes the method immune to environmental light.
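The simulation below sketches the complementary-measurement idea with balanced detection under simple assumptions (noise-free readings, a scalar ambient term); it is not the acquisition code of the paper.

```python
# Complementary Hadamard measurements with balanced detection: each +/-1 pattern
# is split into two binary DMD masks; the balanced detector outputs their
# difference in a single reading, so common-mode ambient light cancels.
import numpy as np
from scipy.linalg import hadamard

N = 16
H = hadamard(N * N)
obj = np.random.rand(N, N)
ambient = 0.7                             # stray light reaching both detector ports

pos = (H + 1) / 2                         # binary mask for the +1 entries
neg = (1 - H) / 2                         # complementary mask for the -1 entries

# One balanced reading per pattern (instead of two sequential ones).
y = (pos @ obj.ravel() + ambient) - (neg @ obj.ravel() + ambient)

recovered = (H.T @ y / (N * N)).reshape(N, N)
print(np.allclose(recovered, obj))        # True: the ambient term drops out
```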
We present a novel approach for imaging through scattering media by combining single-pixel imaging techniques and Fourier spatial filtering. Experimental improvements in both penetration depth and spatial resolution of the acquired images are shown.
Single-pixel detection approaches have been applied with success in different imaging techniques such as optical microscopy. Here we show that it is possible to improve the resolution of single-pixel microscopy by using an array of photodetectors and Fourier ptychography algorithms.
The complete phase and amplitude information of biological specimens can be easily determined by phase-shifting digital holography. Spatial light modulators (SLMs) based on liquid crystal technology, with a frame rate around 60 Hz, have been employed in digital holography. In contrast, digital micro-mirror devices (DMDs) can reach frame rates up to 22 kHz. A method proposed by Lee to design computer-generated holograms (CGHs) permits the use of such binary amplitude modulators as phase-modulation devices. Single-pixel imaging techniques record images by sampling the object with a sequence of micro-structured light patterns and using a simple photodetector. Our group has reported some approaches combining single-pixel imaging and phase-shifting digital holography. In this communication, we review these techniques and present the possibility of a high-speed single-pixel phase-shifting digital holography system with phase-encoded illumination. This system is based on a Mach-Zehnder interferometer, with a DMD acting as the modulator for projecting the sampling patterns on the object and also being used for phase-shifting. The proposed sampling functions are phase-encoded Hadamard patterns generated through a Lee hologram approach. The method allows the recording of the complex amplitude distribution of an object at high speed on account of the high frame rates of the DMD. Reconstruction may take just a few seconds. Besides, the optical setup is envisaged as a true adaptive system, which is able to measure the aberration induced by the optical system in the absence of a sample object, and then to compensate the wavefront in the phase-modulation stage.
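To make the Lee-hologram step concrete, the sketch below shows a generic binary encoding of a phase map for a DMD: thresholding a carrier fringe that is locally shifted by the desired phase produces a binary mask whose first diffraction order carries exp(i*phi). The carrier period and the toy phase ramp are assumptions for illustration, not the patterns used in the paper.

```python
# Generic Lee binary hologram: encode a phase map phi(x, y) as a 0/1 DMD mask
# whose first diffraction order reproduces exp(1j * phi).
import numpy as np

def lee_hologram(phi, carrier_period=8):
    """phi: desired phase map in radians (2D array); returns a binary mask."""
    ny, nx = phi.shape
    x = np.arange(nx)
    carrier = 2 * np.pi * x[None, :] / carrier_period
    return (np.cos(carrier - phi) > 0).astype(np.uint8)

# Toy example: a linear phase ramp standing in for a phase-encoded Hadamard pattern.
phi = np.pi * np.outer(np.ones(64), np.linspace(0.0, 4.0, 64))
mask = lee_hologram(phi)
```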
In recent years, single-pixel imaging (SPI) has become established as a suitable tool for non-invasive imaging of an absorbing object completely embedded in an inhomogeneous medium. One of the main characteristics of the technique is that it uses very simple sensors (bucket detectors such as photodiodes or photomultiplier tubes) combined with structured illumination and mathematical algorithms to recover the image. This reduction in the complexity of the sensing device gives these systems the opportunity to obtain images at shallow depths, overcoming the scattering problem. Nonetheless, some challenges, such as the need for an improved signal-to-noise ratio or a higher frame rate, remain to be tackled before extensive use in practical systems. Also, for intact or live optically thick tissues, epi-detection is commonly used, while present implementations of SPI are limited to transillumination geometries.
In this work we present new features and some recent advances in SPI that involve either the use of computationally efficient algorithms for adaptive sensing or a balanced detection mechanism. Additionally, SPI has been adapted to handle reflected light to create a double-pass optical system. Such developments represent a significant step towards the use of SPI in more realistic scenarios, especially in biophotonics applications. In particular, we show the design of a single-pixel ophthalmoscope as a novel way of imaging the retina in real time.
We describe a method to image objects through scattering media based on single-pixel detection and microstructured illumination. Spatial light modulators are used to project a set of microstructured light patterns onto the sample. The image is retrieved computationally from the photocurrent fluctuations provided by a single-pixel detector. This technique does not require coherent light, raster scanning, time-gated detection or an a priori calibration process. We review several optical setups developed by our research group over the last years, with particular emphasis on a new optical system based on a double-pass configuration and on the combination of single-pixel imaging with Fourier filtering.
Imaging systems based on microstructured illumination and single-pixel detection offer several advantages over conventional imaging techniques. They are an effective method for imaging through scattering media, even in the dynamic case. They work efficiently under low light levels, and the simplicity of the detector makes it easy to design imaging systems working outside the visible spectrum and to acquire multidimensional information. In particular, several approaches have been proposed to record 3D information. The technique is based on sampling the object with a sequence of microstructured light patterns codified onto a programmable spatial light modulator while the light intensity is measured with a single-pixel detector. The image is retrieved computationally from the photocurrent fluctuations provided by the detector. In this contribution we describe an optical system able to produce full-color stereoscopic images by using a few simple optoelectronic components. In our setup we use an off-the-shelf digital light projector (DLP) based on a digital micromirror device (DMD) to generate the light patterns. To capture the color of the scene we take advantage of the codification procedure used by the DLP for color video projection. To record stereoscopic views we use a 90° beam splitter and two mirrors, allowing us to project the patterns from two different viewpoints. By using a single monochromatic photodiode we obtain a pair of color images that can be used as input to a 3-D display. To reduce the pattern projection time we use a compressive sampling algorithm. Experimental results are shown.
There are several ophthalmic devices to image the retina, from fundus cameras capable of imaging the whole fundus to scanning ophthalmoscopes with photoreceptor resolution. Unfortunately, these devices are prone to a variety of ocular conditions, like defocus and media opacities, which usually degrade the quality of the image. Here, we demonstrate a novel approach to image the retina in real time using a single-pixel camera, which has the potential to circumvent those optical restrictions. The imaging procedure is as follows: a set of spatially coded patterns is projected rapidly onto the retina using a digital micromirror device. At the same time, the intensity of the inner product is measured for each pattern with a photomultiplier module. Subsequently, an image of the retina is reconstructed computationally. The obtained image resolution is up to 128 x 128 px, with a real-time video frame rate of up to 11 fps. Experimental results obtained in an artificial eye confirm the tolerance to defocus compared with a conventional multi-pixel-array-based system. Furthermore, the use of multiplexed illumination offers an SNR improvement, leading to a lower illumination of the eye and hence increased patient comfort. In addition, the proposed system could enable imaging in wavelength ranges where cameras are not available.
In this work we have developed a single-pixel optical microscope that provides both reflection and transmission images of the sample under test by attaching a diamond-pixel-layout DMD to a commercial inverted microscope. Our system performs simultaneous measurements in reflection and transmission modes. Besides, in contrast with a conventional system, in our single-element detection system both images belong, unequivocally, to the same plane of the sample. Furthermore, we have designed an algorithm to modify the shape of the projected patterns that improves the resolution and prevents the artifacts produced by the diamond pixel architecture.
One challenge that has long held the attention of scientists is that of clearly seeing objects hidden by turbid media, such as smoke, fog or biological tissue, which has major implications in fields such as remote sensing or early diagnosis of diseases. Here, we combine structured incoherent illumination and bucket detection for imaging an absorbing object completely embedded in a scattering medium. A sequence of low-intensity microstructured light patterns is launched onto the object, whose image is accurately reconstructed from the light fluctuations measured by a single-pixel detector. Our technique is noninvasive, does not require coherent sources, raster scanning or time-gated detection, and benefits from the compressive sensing strategy. As a proof of concept, we experimentally retrieve the image of a transilluminated target both sandwiched between two holographic diffusers and embedded in a 6 mm-thick sample of chicken breast.
Precise control of light propagation through highly scattering media is a much desired goal with major technological implications. Since the interaction of light with turbid media results in partial or complete depletion of ballistic photons, it is in principle impossible to transmit images over distances longer than the extinction length. In biomedical optics, scattering is the dominant light extinction process, accounting almost exclusively for the limited imaging depth range. In addition, most scattering media of interest are dynamic, in the sense that the scattering centers continuously change their positions with time. In our work, we employ single-pixel systems, which can overcome the fundamental limitations imposed by multiple scattering even in the dynamically varying case. A sequence of microstructured light patterns codified onto a programmable spatial light modulator is used to sample an object, and measurements are captured with a single-pixel detector. Acquisition time is reduced by using compressive sensing techniques. The patterns are used as generalized measurement modes in which the object information is expressed. Contrary to techniques based on the transmission matrix, our approach does not require any a priori calibration process. The presence of a scattering medium between the object and the detector scrambles the light and mixes the information from all the regions of the sample. However, the object information that can be retrieved from the generalized modes is not destroyed. Furthermore, by using these techniques we have been able to tackle the general problem of imaging objects completely embedded in a scattering medium.
Although imaging systems that scan a single element benefit from mature technology, they suffer from acquisition times linearly proportional to the spatial resolution. A promising option is to use a single-pixel system that benefits from data collection strategies based on compressive sampling. Single-pixel systems also offer the possibility of using dedicated sensors, such as a fiber spectrometer for multispectral imaging or a distribution of photodiodes for 3D imaging. The image is obtained by lighting the scene with microstructured masks implemented onto a programmable spatial light modulator. The masks are used as generalized measurement modes in which the object information is expressed, and the image is recovered through algebraic optimization. The fundamental reason why the bucket detection strategy can outperform conventional optical array detection is the use of a single-channel detector that simultaneously integrates all the photons transmitted through the patterned scene. Our work makes two specific contributions within the field of single-pixel imaging through patterned illumination. First, we demonstrate that single-pixel imaging improves the resolution of conventional imaging systems, overcoming the Rayleigh criterion. An analysis of resolution using a low-NA microscope objective for imaging onto a CCD camera shows that single-pixel cameras are not limited at all by the optical quality of the collection optics; spatial frequencies that are not transmitted through this low-quality optics are demonstrated to be present in the retrieved image. Second, we experimentally demonstrate the capability of our technique to properly recover an image even when an optical diffuser is located between the sample and the bucket detector.
In computational imaging by pattern projection, a sequence of microstructured light patterns codified onto a programmable spatial light modulator is used to sample an object. The patterns are used as generalized measurement modes in which the object information is expressed. Our paper makes two specific contributions within the field of single-pixel imaging through patterned illumination. First, we perform an analysis of the optical resolution of the computational image. This resolution is shown not to be limited at all by the optical quality of the collection optics. This result is proved by using a low-NA microscope objective for imaging onto a CCD camera. Spatial frequencies that are not transmitted through this low-quality optics are demonstrated to be present in the image retrieved through patterned illumination. Second, we experimentally demonstrate the capability of our technique to properly recover an image even when an optical diffuser is located between the sample and the single-pixel detector.
We have applied an active methodology to pre-service teacher training courses and to optics workshops for in-service teachers. As a practical resource, a set of demonstrations has been used to learn how to perform classroom demonstrations. The set includes experiments on polarization and birefringence, optical information transmission, diffraction, fluorescence and scattering. It was originally prepared for science popularization activities and has been employed in several settings with a variety of audiences. In the teacher training sessions, simple but clarifying experiments have been performed by all the participants. Moreover, in these workshops, devices or basic setups, like the ones included in our demonstration set, have been built. The practical approach has allowed the enthusiastic sharing of teaching and learning experiences among the workshop participants. We believe that such an active orientation in teacher training courses promotes the active and collaborative teaching and learning of optics at different levels of education.
We demonstrate the use of Dammann lenses encoded onto a spatial light modulator (SLM) for triggering nonlinear effects. Under continuous illumination, Dammann lenses generate a multifocal pattern characterized by a set of N foci (diffraction orders), all with the same intensity. We theoretically show that for pulses shorter than 100 femtoseconds (fs) the effects of chromatic aberration influence the uniformity of the generated pattern. Multifocal second harmonic generation (SHG) and on-axis multiple filamentation are produced and actively controlled in β-BaB2O4 (BBO) and fused silica samples, respectively, with an amplified Ti:Sapphire femtosecond laser (30 fs at FWHM). Our proposal allows us to dynamically control both the number of foci and the distance between them. The output diffraction pattern is in good agreement with theoretical calculations. The spectra measured at the rear face of the supercontinuum sample for different separations between foci are also provided. The potential of this technique is very promising in different fields of nonlinear optics and in applications of in-depth material microprocessing.
An increase in the axial ablation region in a micromachining process is demonstrated when a refractive lens (RL) is replaced by a diffractive lens (DL). The depths of focus of a DL and an RL with the same numerical aperture are compared. For ultrashort pulses, the broadband spectrum of the laser, together with the chromatic aberration associated with the DL, makes its ablation region larger than that of an RL. We experimentally measure the three-dimensional ablation region for both types of lenses with 100 fs and 30 fs pulses. This study is expected to help alleviate the mechanical tolerances in femtosecond micromachining with diffractive optical elements (DOEs).
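For context, the well-known chromatic focal shift of a diffractive lens explains why a broadband pulse stretches the ablation region axially; the relation below is the standard textbook expression, stated here as background rather than taken from the paper.

```latex
% Chromatic focal shift of a diffractive lens designed for wavelength \lambda_0
% with nominal focal length f_0:
\[
  f(\lambda) = f_0\,\frac{\lambda_0}{\lambda},
  \qquad
  \Delta f \approx f_0\,\frac{\Delta\lambda}{\lambda_0},
\]
% so the large bandwidth \Delta\lambda of a sub-100 fs pulse spreads the foci of
% the spectral components over an extended axial range, unlike a refractive lens.
```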
We describe and theoretically analyze the self-imaging Talbot effect of entangled photon pairs in the time domain. Rich phenomena are observed in the coherence propagation along dispersive media of mode-locked two-photon states with frequency entanglement exhibiting a comb-like correlation function. The observed effect suggests a straightforward and implementable way to remotely transfer frequency standards embedded in ultracompact quantum light sources.
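As background for readers unfamiliar with temporal self-imaging, the classical (single-photon) temporal Talbot condition is recalled below; the paper's contribution is its generalization to the correlation function of frequency-entangled photon pairs, which is not reproduced here.

```latex
% Classical temporal Talbot condition: a pulse train of period T propagating in a
% medium with group-velocity dispersion \beta_2 self-images at the distances
\[
  z_m = m\,\frac{T^{2}}{2\pi\,\lvert\beta_2\rvert}, \qquad m = 1, 2, 3, \dots
\]
```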
We theoretically demonstrate, using fractional calculus tools, the link between the instantaneous phase profile of a given temporal optical pulse and its photonic semi-differintegration, i.e. a 0.5th-order differentiation/integration. In both cases, the signal's temporal phase can be retrieved by simply dividing two temporal intensity profiles, namely the intensities of the input and output pulses of a spectrally shifted 0.5th-order differentiator/integrator. In both cases, we obtain simple analytical expressions for the instantaneous frequency profile. We numerically prove the viability of these proposals.
Here we numerically demonstrate that the phase profile of a given temporal optical pulse can be retrieved by photonic semi-differintegration, where by semi-differintegration we mean either a 0.5th-order differentiation or integration. In both cases, the signal's temporal phase can be obtained by simply dividing two temporal intensity profiles, namely the intensities of the input and output pulses of a spectrally shifted semi-differintegrator. In both cases, we obtain simple analytical expressions for the phase profile. We numerically prove the viability of these proposals.
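The sketch below shows a generic spectrally shifted 0.5th-order differintegrator applied in the Fourier domain and the intensity ratio the method relies on; the transfer function, test pulse and parameter values are assumptions for illustration, and the papers' analytical phase-retrieval expressions are not reproduced.

```python
# Assumed transfer function of a spectrally shifted fractional differintegrator:
# H(w) = [1j * (w - w_shift)]**p, with p = +0.5 (differentiation) or -0.5 (integration).
import numpy as np

def semi_differintegrate(field_t, dt, p=0.5, w_shift=0.0):
    """Apply the order-p differintegrator, shifted by w_shift [rad/s], in Fourier space."""
    n = field_t.size
    w = 2 * np.pi * np.fft.fftfreq(n, d=dt)
    H = (1j * (w - w_shift)) ** p
    return np.fft.ifft(np.fft.fft(field_t) * H)

# Toy chirped Gaussian pulse; the method divides |output|^2 by |input|^2 to
# extract the temporal phase information.
t = np.linspace(-5e-12, 5e-12, 4096)
dt = t[1] - t[0]
pulse = np.exp(-(t / 1e-12) ** 2 + 1j * 2e23 * t ** 2)
out = semi_differintegrate(pulse, dt, p=0.5, w_shift=2 * np.pi * 0.2e12)
ratio = np.abs(out) ** 2 / np.maximum(np.abs(pulse) ** 2, 1e-30)
```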
Determining the precise location of irradiance centroids is a key step in optical triangulation and in wavefront sensing based on wavefront slope measurements (as, e.g., in Hartmann-Shack aberrometry). Since most aberrometers include some kind of optical relay system to reimage the irradiance distributions provided by the wavefront sampling element onto the irradiance detector, it is essential to ensure that the centroid position and momentum information is preserved through this operation. In optical systems with ABCD diffraction kernels, the centroids propagate according to an effective geometrical-optics rule. However, the presence of finite apertures partially blocking the incoming beam, or non-uniform transmittances unevenly altering its original irradiance distribution, may give rise to potentially significant departures from this simple geometrical picture. The potential magnitude of this bias makes it advisable to take proper steps to counteract it in the design of aberrometric setups.
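The sketch below writes out the effective geometrical-optics rule mentioned in the abstract for an ABCD system; the example relay and numerical values are assumptions, and the aperture-clipping corrections the abstract warns about are not modeled.

```python
# Effective geometrical-optics rule: for an unclipped beam, the irradiance
# centroid position and mean ray slope transform like a single paraxial ray.
import numpy as np

def propagate_centroid(abcd, x_centroid, mean_slope):
    """Return (centroid, mean slope) after the ABCD system."""
    a, b = abcd[0]
    c, d = abcd[1]
    return a * x_centroid + b * mean_slope, c * x_centroid + d * mean_slope

# Example: a unit-magnification 4f relay reimaging the sampling plane onto the detector.
relay_4f = np.array([[-1.0, 0.0],
                     [0.0, -1.0]])
x_out, s_out = propagate_centroid(relay_4f, x_centroid=1e-4, mean_slope=2e-3)
```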
In this contribution we describe a method for achieving a phase-only modulation regime with an off-the-shelf twisted nematic liquid crystal display (TNLCD). The keystone of this procedure involves illumination of an addressed TNLCD with circularly polarized light. The analysis of the distribution of the output polarization states in the S1-S2 plane as the applied voltage is changed suggests a simple way to optimize the liquid crystal phase response. For this purpose, a properly oriented quarter-wave plate followed by an analyzer is used behind the TNLCD. Laboratory results for a commercial display are presented. Our experiments show a phase modulation depth of 240º for a wavelength of 514 nm with a residual intensity variation lower than 4%.
In the last few years, many efforts have been devoted to the use of electrically addressed spatial light modulators (SLMs) in adaptive optics. In this contribution we have optimized a low-cost SLM based on a liquid crystal (LC) device for the compensation of eye aberrations. This kind of device is seldom used in ophthalmic applications because of the relatively low dynamic range of the phase retardation that can be introduced at each pixel. Here, we have optimized the phase modulation response of a commercial twisted nematic liquid crystal display (TNLCD) by means of a polarimetric arrangement that includes retarder plates and polarizers. Furthermore, we describe an efficient four-level phase encoding scheme that allows us to use these conventional SLMs for the compensation of optical aberrations such as those typically found in human eyes. To obtain experimental compensation results we have used artificial aberrated eyes simulated with refractive phase plates. This proof of concept is the first step towards developing a low-cost real-time system for the correction of eye aberrations.
We present an achromatization procedure for multiple images obtained with the Talbot illuminator. It consists of combining, in one optical arrangement, an achromatic Fresnel diffraction setup and the kinoform Talbot illuminator. In this way, the multiple images produced by the Talbot illuminator are obtained using totally incoherent light, both spatially and temporally. We present experimental results that confirm the correctness of the proposed approach.
The equivalence between a twisted-nematic liquid crystal cell and the combination of a retardation wave-plate and a polarization rotator can be used to calibrate a voltage-addressed liquid crystal display. We present a simple polarimetric procedure to determine the two parameters that define the optical properties of the equivalent retarder-rotator system for each value of the applied voltage. Once the calibration procedure is performed, the optical response of the liquid crystal cell can be predicted and optimized. In particular, we demonstrate the generation of a family of equi-azimuth polarization states with a liquid crystal display sandwiched by a polarizer and a quarter-wave plate, whose optimal orientations are evaluated by a numerical simulation. Laboratory results corresponding to a commercial liquid crystal display are presented.
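A minimal Jones-calculus sketch of the retarder-rotator equivalent model is given below; the parameter names and example values are assumptions used for illustration, not the calibration data of the paper.

```python
# Equivalent model of a twisted-nematic cell: up to a global phase, a linear
# retarder of retardance delta followed by a polarization rotator of angle psi.
import numpy as np

def rotator(psi):
    return np.array([[np.cos(psi), -np.sin(psi)],
                     [np.sin(psi),  np.cos(psi)]])

def retarder(delta):
    return np.array([[np.exp(-1j * delta / 2), 0.0],
                     [0.0, np.exp(1j * delta / 2)]])

def tn_cell(psi, delta):
    """Jones matrix of the voltage-addressed cell in the equivalent model."""
    return rotator(psi) @ retarder(delta)

# Predicted output state for 45-degree linear input; fitting (psi, delta) to
# polarimetric measurements at each voltage calibrates the display.
e_in = np.array([1.0, 1.0]) / np.sqrt(2)
e_out = tn_cell(psi=np.deg2rad(30), delta=np.deg2rad(140)) @ e_in
```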
We describe some optical combinations of refractive and diffractive lenses to compensate for the inherent wavelength dispersion shown by interference and diffraction patterns under white-light point-source illumination. In a second phase, this achromatic behavior is also applied to the case of spatially incoherent polychromatic light, i.e., to totally incoherent illumination. Finally, the above results are extended to the correction of chromatic distortion associated with diffraction of femtosecond pulse light. Several experimental results are shown.
The apodization of diffractive optical elements can be realized by a local change of their diffraction efficiency. In the case of lithographic elements with a step-like structure of the period, the variable diffraction efficiency can be achieved by a gradual transformation of the 2m-step kinoform into its conjugate counterpart across the apodization region. In the present contribution we show experimental results confirming this idea, which until now had been verified only by simulations. The apodized quaternary grating with locally varying diffraction efficiency was obtained on an SLM device as a programmable diffractive optical element by changing the period's profile gradually. Knowledge of the phase heights of the SLM's pixels is required for successful implementation of the apodization function. It was determined from Fresnel images of binary phase gratings with different phase step heights programmed on the SLM. The Fresnel images then become binary, and their visibility depends on the phase height of the grating in a known way, which makes it possible to calibrate the SLM.
We describe several methods to extend security techniques based on optical processing to work under broadband illumination. The key point of our procedures is the design of dispersion-compensated optical processors by combining a small number of diffractive and refractive lenses. Our optical configurations provide, in a first-order approximation, the Fraunhofer diffraction pattern of the input signal in a single plane and with the same scale for all the wavelengths of the incident light. In this way, our achromatic hybrid systems allow us to reconstruct color holograms with white light. These achromatic hybrid (diffractive-refractive) systems are applied, in a second stage, to implement color processing operations with white light, such as color pattern recognition. In this direction, we also design a technique to encrypt color input objects into computer-generated color holograms, which are decrypted optically with an achromatic joint transform correlator architecture under white-light illumination. Finally, we describe a totally incoherent optical processor that is able to perform color processing operations under natural illumination (both spatially and temporally incoherent). This system is applied to perform color pattern recognition and optical encryption operations under natural light. Numerical and experimental results are shown.
We report on a hybrid (diffractive-refractive) wavelength-independent imaging setup with an intermediate achromatic filtering plane in the Fresnel domain. Therefore, the system acts as a chromatically-compensated Fresnel processor able to perform space-variant color pattern recognition operations in a single step.
A novel optical setup that allows a totally incoherent Lau effect is demonstrated. It is based on dispersion-compensated techniques that employ strong dispersive elements. In this way, three commercially available diffractive lenses and a refractive objective are used for achromatic Lau fringe production with spatially and temporally incoherent illumination.
Diffraction-based optical correlators working under broadband illumination, in contrast to their coherent counterparts, allow us to exploit color information. However, the use of the wavelength as an additional parameter requires taking into account the chromatic dispersion inherent to the diffraction process. In this contribution, we describe a novel family of dispersion-compensated broadband optical correlators, some of which operate in the Fourier region and some in the Fresnel region. In both cases, the chromatic compensation is achieved by a proper combination of a small number of commercially available optical elements, namely conventional diffractive and refractive lenses. In all cases, and with a single matched filter, the chromatic content of the correlation peak provides the spectral composition of the detected color signal. On top of that, some of our optical solutions work with point-source illumination and others with spatially incoherent light. In this way, on the one hand, our spatially coherent optical designs allow us to perform the color correlation in amplitude for each spectral channel; accordingly, working in the Fresnel domain, we achieve a space-variant color pattern recognition setup. On the other hand, totally incoherent optical correlators, which are linear in irradiance, provide important practical advantages, as they employ natural light and allow us to deal with diffuse, reflecting or self-luminous color objects.
We report herein a hybrid (diffractive-refractive) lens triplet showing quasi-wavelength-independent optical Fourier transform capabilities. The wavelength compensation carried out by our novel optical design is exact for the axial position of the Fourier transform of the input. Nevertheless, a very low residual transversal chromatic aberration remains. Results of laboratory experiments will be shown.
We report a new achromatic Fourier processor basically constituted by a quasi-wavelength-independent image-forming system whose first half performs an achromatic Fourier transform of a color input object. Consequently, this optical architecture, formed by a small number of diffractive and refractive lenses, provides an intermediate achromatic real Fraunhofer plane and a final color image with a high signal-to-noise ratio. In this way, our optical processor can perform simultaneously the same spatial filtering operation for all the spectral components of the broadband illumination.
The chromatic blurring in the recording of the joint power spectrum in the first stage of a white-light joint transform correlator experiment can be avoided by using an achromatic Fourier transformer. In this way, we propose a novel colour pattern recognition technique that is based on an achromatic joint transform correlator architecture. Experimental results of our procedure are also shown.
We report two achromatic Fourier transform setups working under broadband converging spherical wave illumination. Both optical configurations provide the achromatic Fraunhofer diffraction pattern of any colour pupil with adjustable magnification, and with low geometrical residual chromatic aberrations even for white-light illumination. Results of laboratory experiments will be shown.
Earlier treatments of scalar and electromagnetic focusing problems have been based on either the Debye or the Kirchhoff approximation. Recently, the Kirchhoff theory has been shown to be superior to the Debye theory at low Fresnel numbers, but further theoretical tests of either theory are hampered by the lack of exact solutions. Therefore, exact solutions are constructed in this paper for the focusing of 2D electromagnetic waves through a slit in a perfectly conducting screen.