Multispectral fluorescence lifetime imaging microscopy (λFLIM) is a powerful optical technique for investigating biological processes, but it generally requires long acquisition times. The Single-Pixel Camera (SPC) is an imaging architecture based on Compressive Sensing (CS) that strongly reduces the acquisition time while preserving the information content, at the cost of an increased computational time. In this work we present a λFLIM microscope based on the CS-SPC architecture. We tested the multiscale capability of the system by merging SPC zooming with data fusion, and we propose a fast fitting framework that runs in parallel with the acquisition, allowing rapid visualization.
Multispectral fluorescence lifetime imaging microscopy (FLIM) is a valuable tool for biomedical and environmental applications. Its multidimensional acquisition scheme (space, time, spectrum) provides high information content, at the cost of long acquisition and processing times. Compressive Sensing (CS) combined with a Single-Pixel Camera (SPC) acquisition scheme has been proposed as a strategy to reduce the number of measurements. We present a multispectral FLIM system based on SPC, CS, and data fusion (DF) with a high-resolution camera to strongly reduce the acquisition time. We adopted a novel TCSPC method to increase the count rate. The system is characterized and validated on a cellular sample.
Time-resolved multispectral fluorescence microscopy provides a 4D hypercube dataset with high specificity for cellular examination. However, this is generally obtained by significantly increasing the measurement time, which is quite limiting for in vivo measurements or with photosensitive samples. The measurement effort can be reduced with a novel microscopy framework exploiting compressive sensing based on a single-pixel camera. In this work, we present such a compressive sensing system and validate it with a cellular sample. Data fusion with a high-resolution camera image allows us to tackle the well-known problem of low spatial resolution in single-pixel imaging.
Light scattering has proven to be a hard limitation in a wide range of sensing applications, such as astronomical or biological imaging. In microscopy systems, the random perturbations introduced to the wavefront limit the achievable spatial resolution and imaging depth. In the past, several methods have been proposed to control how light interacts with the medium, allowing focusing and imaging through multiple scattering media by using wavefront shaping techniques. However, non-invasively imaging objects behind scattering media over large fields of view remains a challenging feat.
Here, we introduce a novel approach that recovers extended fluorescent objects behind scattering layers well beyond the optical memory effect (ME) range without using either adaptive optics or wavefront shaping techniques. To do so, we project a collection of unknown random speckle illumination patterns through the scattering medium by using a simple rotating diffuser. For each position of the rotating diffuser, a different incoherent sum of speckle patterns is recorded by the camera. Even though these images are low-contrast, random, and appear to carry no information at all, they contain information about the positions of the emitters. We show that, if enough images are measured, Non-negative Matrix Factorization can demix this information and retrieve the relative position of each fluorescent emitter in the sample.
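The demixing step can be sketched numerically. The toy example below is an illustrative assumption, not the authors' implementation: sizes, the multiplicative-update NMF variant, and the iteration count are all chosen only to show how incoherent speckle sums factor into per-emitter fingerprints.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 3 emitters, 64 camera pixels, 200 diffuser positions.
n_emitters, n_pixels, n_frames = 3, 64, 200

# Each emitter has a fixed (unknown) speckle "fingerprint" on the camera.
fingerprints = rng.random((n_emitters, n_pixels))

# Each diffuser position excites the emitters with random weights, so every
# camera frame is an incoherent (non-negative) sum of the fingerprints.
weights = rng.random((n_frames, n_emitters))
frames = weights @ fingerprints                     # (n_frames, n_pixels)

# Demix with NMF (multiplicative updates): frames ~ W @ F, all entries >= 0.
W = rng.random((n_frames, n_emitters)) + 0.1
F = rng.random((n_emitters, n_pixels)) + 0.1
for _ in range(1000):
    F *= (W.T @ frames) / (W.T @ W @ F + 1e-12)
    W *= (frames @ F.T) / (W @ F @ F.T + 1e-12)

# On exact low-rank data the factorization residual becomes very small.
residual = np.linalg.norm(frames - W @ F) / np.linalg.norm(frames)
```

The rows of `F` then play the role of the recovered speckle fingerprints, from which relative emitter positions can be inferred.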
As a proof of concept, we show experimental results with both sparse and continuous objects, covering fields of view of up to three times the optical memory effect range.
We present a technique for multispectral fluorescence lifetime imaging with high spatial resolution that combines single-pixel and data fusion imaging techniques. The system relies on the combined use of three different sensors: two single-pixel cameras capturing multispectral and time-resolved information, and a conventional 2D array detector capturing high-spatial-resolution images. The resultant giga-voxel 4D hypercube is acquired rapidly by measuring only 0.03% of the dataset. The fusion is formulated as a regularized inverse problem that is solved efficiently via gradient descent. The system can be used to identify fluorophore species.
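The fusion step can be illustrated with a minimal 1-D sketch. The block-averaging sampling operator and the simple Tikhonov-style pull toward the high-resolution image are assumptions made for illustration, not the actual regularizer; the point is only the gradient-descent solution of a regularized inverse problem.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem: recover a high-resolution signal x from a low-resolution
# single-pixel measurement y = D x, regularized toward a high-resolution
# "guide" g from the 2D array detector.
n_hi, factor = 64, 4
x_true = np.repeat(rng.random(n_hi // 8), 8)         # piecewise-constant scene

# D averages blocks of `factor` pixels (low-resolution sampling operator).
D = np.kron(np.eye(n_hi // factor), np.ones((1, factor)) / factor)
y = D @ x_true                                        # low-res measurement
g = x_true + 0.01 * rng.standard_normal(n_hi)         # noisy high-res guide

# Minimize  ||D x - y||^2 + lam * ||x - g||^2  by plain gradient descent.
lam, step = 0.1, 0.5
x = np.zeros(n_hi)
for _ in range(2000):
    grad = 2 * D.T @ (D @ x - y) + 2 * lam * (x - g)
    x -= step * grad

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

The data term fixes the block means from the single-pixel measurement, while the regularizer fills in the high-frequency detail from the camera image.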
We present a technique to capture high spatial resolution, multispectral, and time-resolved fluorescence images by combining both single-pixel and data fusion imaging techniques. The resultant 4D hypercube can be used to identify fluorophore species.
We present a novel phase imaging system based on a non-interferometric approach to obtain the complex amplitude of a wavefront. We sample the wavefront with a high-speed spatial light modulator. Then, a single lens forms a time-dependent light distribution on its focal plane, where a position detector is placed. This simple setup, with the aid of computational techniques, provides high spatial resolution, power efficiency, and high dynamic range, and allows operation in spectral regions outside the visible range. The validity of the technique is demonstrated by measuring both optical aberrations and phase distributions of transparent samples.
The high data bandwidth of Raman imaging precludes high-speed spectroscopic imaging. Emerging compressive sensing hyperspectroscopy techniques could, in principle, address this issue by combining undersampling methodologies with computational reconstruction. However, compressive spectrometer layouts have prohibitive losses for low-light-level applications, such as spontaneous Raman imaging of dynamic biological specimens. These losses arise because high-sensitivity light detectors (photon counters) have active areas that are too small (typically ~100 µm) compared to the size of the digital micromirror devices (DMDs) (~10 mm) used in most compressive layouts. Inspired by the pulse-shaping techniques of ultrafast spectroscopy, we present a new programmable spectrometer layout with high throughput and a large spectral coupling bandwidth. Exploiting amplitude spectral modulation with a DMD allows conventional and compressive Raman imaging and spectroscopy acquisitions with shot-noise-limited sensitivity. With this spectrometer, we demonstrate compressed hyperspectroscopy at faster speeds and lower costs than the traditional cameras used in Raman imaging applications. We showcase imaging of biological specimens at high spatial resolution (250 nm).
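The spectral multiplexing idea behind amplitude modulation in the spectral plane can be sketched as follows. The Hadamard mask set, channel count, and noise-free detection are illustrative assumptions rather than the actual spectrometer design.

```python
import numpy as np

rng = np.random.default_rng(2)

n = 16                                  # number of spectral channels (toy)
spectrum = rng.random(n)                # unknown Raman spectrum

# Sylvester-Hadamard matrix; its 0/1 version gives binary DMD masks applied
# in the spectral plane (mirrors on/off per wavelength channel).
H = np.array([[1.0]])
while H.shape[0] < n:
    H = np.block([[H, H], [H, -H]])
masks = (H + 1) / 2

# The photon counter integrates all light passed by each mask.
measurements = masks @ spectrum         # noise-free counts per mask

# Demultiplex: masks = (H + 1)/2, so H @ spectrum = 2*m - total, where the
# all-on first mask gives total = sum(spectrum); invert with H @ H = n*I.
total = measurements[0]
recovered = H @ (2 * measurements - total) / n
```

Multiplexing many channels onto one sensitive photon counter is what preserves throughput compared with dispersing the spectrum across a lossy array.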
We present a phase imaging system using a novel non-interferometric approach. We overcome the limitations in spatial resolution, optical efficiency, and dynamic range that are found in Shack-Hartmann sensors. To do so, we sample the wavefront using a digital micromirror device. A single lens forms a time-dependent light distribution on its focal plane, where a position detector is placed. Our approach is lenslet-free and does not use any iterative or phase-unwrapping algorithm to recover the phase information. The validity of our approach is demonstrated by performing both aberration sensing and phase imaging of transparent samples.
We perform phase imaging using a non-interferometric approach to measure the complex amplitude of a wavefront. We overcome the limitations in spatial resolution, optical efficiency, and dynamic range that are found in Shack-Hartmann wavefront sensing. To do so, we sample the wavefront with a high-speed spatial light modulator. A single lens forms a time-dependent light distribution on its focal plane, where a position detector is placed. Our approach is lenslet-free and does not rely on any iterative or phase-unwrapping algorithm. The validity of our technique is demonstrated by performing both aberration sensing and phase imaging of transparent samples.
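A minimal sketch of the underlying slope-to-phase pipeline, under an assumed toy geometry (the patch pitch, focal length, and wavelength below are illustrative, not the system's parameters): each sampled patch yields a focal-spot displacement proportional to the local phase slope, and the phase map follows by integrating the slopes.

```python
import numpy as np

# Toy 1-D geometry (illustrative values).
n, pitch = 32, 100e-6                      # patches and patch pitch (m)
f, k = 50e-3, 2 * np.pi / 633e-9           # focal length (m), wavenumber (1/m)

x = np.arange(n) * pitch
phase_true = 40.0 * (x / x[-1]) ** 2       # quadratic aberration (radians)

# Simulated measurement: spot displacement = f * (local slope) / k.
slopes = np.gradient(phase_true, pitch)    # rad/m, one value per patch
spots = f * slopes / k                     # centroid positions on detector

# Reconstruction: recover slopes from spot positions, then integrate.
slopes_meas = k * spots / f
phase_rec = np.cumsum(slopes_meas) * pitch
phase_rec -= phase_rec[0] - phase_true[0]  # phase known up to a constant
```

The cumulative sum approximates the line integral of the slopes, so the reconstruction matches the 40 rad peak aberration to within a few percent.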
Over the past decade, single-pixel imaging (SPI) has become established as a viable tool in scenarios where traditional imaging techniques struggle to provide images with acceptable quality at practicable times and reasonable costs. However, SPI still has several inherent limitations, such as sensitivity to spurious light and difficulty operating in real time. Here we present a novel approach that uses complementary measurements and a single balanced detector. Balanced detection improves the frame rate of complementary-measurement architectures by a factor of two. Furthermore, the balanced detector provides immunity to environmental light.
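The ambient-light cancellation afforded by balanced detection can be sketched numerically. The toy 1-D scene and the Hadamard pattern set below are assumptions for illustration; the key point is that each pattern and its complement are measured in a single shot as a difference, so a common ambient term drops out.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 16                                   # toy "image" with 16 pixels
scene = rng.random(n)
ambient = 0.7                            # spurious light hitting both ports

# Sylvester-Hadamard matrix; each row splits the DMD mirrors into two
# complementary binary patterns, P (where H = +1) and 1 - P (where H = -1).
H = np.array([[1.0]])
while H.shape[0] < n:
    H = np.block([[H, H], [H, -H]])
P = (H + 1) / 2

# A balanced detector outputs the difference of its two ports directly, so
# one display yields (P - (1 - P)) @ scene = H @ scene in a single shot,
# and the common ambient term cancels.
diff = (P @ scene + ambient) - ((1 - P) @ scene + ambient)

image = H @ diff / n                     # inverse transform (H @ H = n*I)
```

Measuring the difference in one shot, instead of displaying the pattern and its complement sequentially, is what doubles the frame rate.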
We present a novel approach for imaging through scattering media by combining single-pixel imaging techniques and Fourier spatial filtering. Experimental improvements in both penetration depth and spatial resolution of the acquired images are shown.
The complete phase and amplitude information of biological specimens can be readily determined by phase-shifting digital holography. Spatial light modulators (SLMs) based on liquid crystal technology, with frame rates around 60 Hz, have been employed in digital holography. In contrast, digital micromirror devices (DMDs) can reach frame rates up to 22 kHz. A method proposed by Lee to design computer-generated holograms (CGHs) permits the use of such binary amplitude modulators as phase-modulation devices. Single-pixel imaging techniques record images by sampling the object with a sequence of microstructured light patterns and using a simple photodetector. Our group has reported several approaches combining single-pixel imaging and phase-shifting digital holography. In this communication, we review these techniques and present the possibility of a high-speed single-pixel phase-shifting digital holography system with phase-encoded illumination. The system is based on a Mach-Zehnder interferometer, with a DMD acting as the modulator that projects the sampling patterns onto the object and is also used for phase shifting. The proposed sampling functions are phase-encoded Hadamard patterns generated through the Lee hologram approach. The method allows recording the complex amplitude distribution of an object at high speed on account of the high frame rates of the DMD. Reconstruction may take just a few seconds. Moreover, the optical setup is envisaged as a true adaptive system, able to measure the aberration induced by the optical system in the absence of a sample object and then compensate the wavefront in the phase-modulation stage.
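The Lee-hologram idea of encoding phase with a binary amplitude device can be sketched numerically. The carrier frequency and duty cycle below are illustrative choices, not the system's parameters: a binary fringe pattern is built for a constant target phase, and the phase carried by the first diffraction order is checked.

```python
import numpy as np

nx, carrier = 256, 1 / 8           # DMD columns and carrier (cycles/pixel)
phi = 1.2                          # target phase to encode (radians)

x = np.arange(nx)
# Lee hologram: switch a mirror on wherever the phase-shifted carrier
# fringe is positive (duty cycle 1/2).
hologram = (np.cos(2 * np.pi * carrier * x - phi) > 0).astype(float)

# The complex amplitude diffracted into the first order is the Fourier
# coefficient of the binary pattern at the carrier frequency.
order1 = np.sum(hologram * np.exp(2j * np.pi * carrier * x)) / nx
encoded_phase = np.angle(order1)   # close to phi (pixelation quantizes it)
```

Spatially varying phase maps follow by letting `phi` vary across the pattern, which is how phase-encoded Hadamard patterns can be generated on a purely binary device.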
We describe a method to image objects through scattering media based on single-pixel detection and microstructured illumination. Spatial light modulators are used to project a set of microstructured light patterns onto the sample. The image is retrieved computationally from the photocurrent fluctuations provided by a single-pixel detector. This technique does not require coherent light, raster scanning, time-gated detection, or an a priori calibration process. We review several optical setups developed by our research group in recent years, with particular emphasis on a new optical system based on a double-pass configuration and on the combination of single-pixel imaging with Fourier filtering.
One challenge that has long held the attention of scientists is that of clearly seeing objects hidden by turbid media, such as smoke, fog, or biological tissue, which has major implications in fields such as remote sensing or the early diagnosis of diseases. Here, we combine structured incoherent illumination and bucket detection to image an absorbing object completely embedded in a scattering medium. A sequence of low-intensity microstructured light patterns is launched onto the object, whose image is accurately reconstructed from the light fluctuations measured by a single-pixel detector. Our technique is noninvasive, does not require coherent sources, raster scanning, or time-gated detection, and benefits from the compressive sensing strategy. As a proof of concept, we experimentally retrieve the image of a transilluminated target both sandwiched between two holographic diffusers and embedded in a 6 mm-thick sample of chicken breast.
Precise control of light propagation through highly scattering media is a much desired goal with major technological implications. Since the interaction of light with turbid media results in partial or complete depletion of ballistic photons, it is in principle impossible to transmit images through distances longer than the extinction length. In biomedical optics, scattering is the dominant light extinction process, accounting almost exclusively for the limited imaging depth range. In addition, most scattering media of interest are dynamic, in the sense that the scattering centers continuously change their positions with time. In our work, we employ single-pixel systems, which can overcome the fundamental limitations imposed by multiple scattering even in the dynamically varying case. A sequence of microstructured light patterns encoded onto a programmable spatial light modulator is used to sample the object, and measurements are captured with a single-pixel detector. Acquisition time is reduced by using compressive sensing techniques. The patterns act as generalized measurement modes in which the object information is expressed. Contrary to techniques based on the transmission matrix, our approach does not require any a priori calibration process. The presence of a scattering medium between the object and the detector scrambles the light and mixes the information from all regions of the sample. However, the object information carried by the generalized modes is not destroyed. Furthermore, by using these techniques we have been able to tackle the general problem of imaging objects completely embedded in a scattering medium.
Although imaging systems that scan a single element benefit from mature technology, they suffer from acquisition times that scale linearly with the spatial resolution. A promising alternative is a single-pixel system that benefits from data collection strategies based on compressive sampling. Single-pixel systems also offer the possibility of using dedicated sensors, such as a fiber spectrometer for multispectral imaging or a distribution of photodiodes for 3D imaging. The image is obtained by lighting the scene with microstructured masks implemented onto a programmable spatial light modulator. The masks act as generalized measurement modes in which the object information is expressed, and the image is recovered through algebraic optimization. The fundamental reason why the bucket detection strategy can outperform conventional optical array detection is the use of a single-channel detector that simultaneously integrates all the photons transmitted through the patterned scene. Spatial frequencies that are not transmitted by low-quality collection optics are shown to be present in the retrieved image. Our work makes two specific contributions within the field of single-pixel imaging through patterned illumination. First, we demonstrate that single-pixel imaging improves the resolution of conventional imaging systems, overcoming the Rayleigh criterion. An analysis of resolution using a low-NA microscope objective for imaging onto a CCD camera shows that single-pixel cameras are not limited by the optical quality of the collection optics. Second, we experimentally demonstrate the capability of our technique to properly recover an image even when an optical diffuser is located between the sample and the bucket detector.
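The bucket-detection argument (a diffuser between sample and detector redistributes the light but does not destroy the measured signal) can be sketched numerically. The toy 1-D scene, Hadamard pattern set, and energy-conserving scrambling matrix below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

n = 64                                   # toy scene with 64 "pixels"
scene = rng.random(n)

# Sylvester-Hadamard patterns projected onto the scene (here +1/-1 valued;
# in practice each row is realized as two binary DMD patterns).
H = np.array([[1.0]])
while H.shape[0] < n:
    H = np.block([[H, H], [H, -H]])

# A diffuser between scene and detector scrambles light among detector
# areas, but the bucket detector integrates everything, so the measurement
# (total collected photons) is unchanged by the scrambling.
S = rng.random((n, n))
S /= S.sum(axis=0, keepdims=True)        # columns sum to 1: sum is preserved

def bucket(field):
    return np.sum(S @ field)             # integrate all scrambled light

measurements = np.array([bucket(row * scene) for row in H])
image = H @ measurements / n             # inverse Hadamard transform
```

Because each measurement is a total photon count, the scrambling matrix `S` drops out entirely, which is why the reconstruction is insensitive to the diffuser.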