The defense, intelligence, and homeland security communities are driving a need for software-dominant, real-time or
near-real-time atmospheric-turbulence-compensated imagery. The development of parallel processing capabilities
is finding application in diverse areas including image processing, target tracking, pattern recognition, and image
fusion, to name a few. This paper addresses a novel approach to the computationally intensive case of software-dominant optical and near-infrared imaging through atmospheric turbulence. Previously, the somewhat conventional
wavelength diversity method has been used to compensate for atmospheric turbulence with great success. We apply
a new correlation-based approach to the wavelength diversity methodology using a parallel processing architecture,
enabling high-speed atmospheric turbulence compensation. Methods for optical imaging through distributed
turbulence are discussed, simulation results are presented, and computational and performance assessments are
provided.
Phase diversity imaging methods work well in removing atmospheric turbulence and some system effects from predominantly near-field imaging systems. However, phase diversity approaches can be computationally intensive and slow. We present a recently adapted, high-speed phase diversity method using a conventional, software-based neural network paradigm. This phase-diversity method has the advantage of eliminating many time-consuming, computationally heavy calculations and directly estimates the optical transfer function from the entrance pupil phases or phase differences. Additionally, this method is more accurate than conventional Zernike-based phase diversity approaches and lends itself to implementation on parallel software or hardware architectures. We use computer simulation to demonstrate how this high-speed, phase-diverse imaging method can be implemented on a parallel, high-speed, neural network-based architecture, specifically the Cellular Neural Network (CNN). The CNN architecture was chosen as a representative neural network-based processing environment because 1) the CNN can be implemented in 2-D or 3-D processing schemes, 2) it can be implemented in hardware or software, 3) recent 2-D implementations of CNN technology have shown a three-orders-of-magnitude advantage in speed, area, or power over equivalent digital representations, and 4) a complete development environment exists. We also provide a short discussion of processing speed.
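The core of the direct estimation described above is that the optical transfer function is the normalized autocorrelation of the generalized pupil function built from the sampled entrance-pupil phases. The NumPy sketch below is an illustration of that relationship only; the function name, grid size, and padding choices are our assumptions, not the paper's implementation:

```python
import numpy as np

def otf_from_pupil_phase(phase, aperture):
    """Estimate the OTF directly from sampled pupil-plane phases.

    The OTF is the normalized autocorrelation of the generalized pupil
    function P = A * exp(i * phi), computed here with FFTs:
    PSF = |FFT(P)|^2, OTF = IFFT(PSF), normalized so OTF(0, 0) = 1.
    """
    n = phase.shape[0]
    pupil = aperture * np.exp(1j * phase)
    # Zero-pad to 2n x 2n so the circular autocorrelation does not wrap.
    psf = np.abs(np.fft.fft2(pupil, s=(2 * n, 2 * n))) ** 2
    otf = np.fft.ifft2(psf)
    return otf / otf[0, 0]
```

For zero phase (a diffraction-limited aperture) this reduces to the aperture autocorrelation, whose magnitude peaks at unity at zero spatial frequency.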
The well-known phase diversity technique has long been used as a premier passive imaging method to mitigate the
degrading effects of atmospheric turbulence on incoherent optical imagery. Typically, a slow, iterative method is
applied that uses the Zernike basis set and 2-D Fourier transforms in the reconstruction process. In this paper, we
demonstrate a direct method for estimating the unaberrated object brightness from phase or phase difference estimates
that 1) does not require the use of the Zernike basis set or the intermediate determination of the generalized pupil
function, 2) directly determines the optical transfer function without requiring an iterative sequence of 2-D
Fourier transforms, 3) provides a more accurate result than the Zernike-based approaches since there are no Zernike
series truncation errors, 4) lends itself to fast, parallel implementation, and 5) can use stochastic search methods to
rapidly determine the simultaneous phases or phase differences required to determine the correct optical transfer function
estimate. As such, this new implementation of phase diversity provides potentially faster, more accurate results than
previous approaches yet retains inherent compatibility with the traditional Zernike-based methods. The theoretical
underpinnings of this new method, along with demonstrative computer simulation results, are presented.
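For context on item 5, a stochastic search over candidate phases needs an objective function; the classical object-independent choice is the Gonsalves phase-diversity metric, E = Σ |D₁H₂ − D₂H₁|² / (|H₁|² + |H₂|²), where the D are image spectra and the H are the OTFs of the two diversity channels. The sketch below is that textbook formulation, offered only for illustration; the function names and grid choices are assumptions, and this is not the paper's direct method itself:

```python
import numpy as np

def pd_metric(d1, d2, phase, diversity, aperture):
    """Gonsalves-style phase-diversity error metric.

    d1, d2    : focused and diversity images on the padded 2n x 2n grid
    phase     : candidate n x n pupil phase
    diversity : known diversity (e.g. defocus) phase, n x n
    aperture  : n x n binary aperture mask
    Returns a scalar that is minimized when `phase` matches the true phase.
    """
    n = phase.shape[0]

    def otf(phi):
        # OTF as the normalized autocorrelation of the generalized pupil.
        P = np.fft.fft2(aperture * np.exp(1j * phi), s=(2 * n, 2 * n))
        H = np.fft.ifft2(np.abs(P) ** 2)
        return H / H[0, 0]

    H1 = otf(phase)
    H2 = otf(phase + diversity)
    D1 = np.fft.fft2(d1)
    D2 = np.fft.fft2(d2)
    denom = np.abs(H1) ** 2 + np.abs(H2) ** 2 + 1e-12  # regularized
    return np.sum(np.abs(D1 * H2 - D2 * H1) ** 2 / denom)
```

For noiseless data generated with the true phase, the metric is essentially zero, so any search method (stochastic or otherwise) can use it as a fitness function.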
The advancement of neural network methods and technologies is finding applications in many fields and disciplines of
interest to the defense, intelligence, and homeland security communities. Rapidly reconfigurable sensors for real-time or
near-real-time signal or image processing can be used for multi-functional purposes such as image compression, target
tracking, image fusion, edge detection, thresholding, pattern recognition, and atmospheric turbulence compensation, to
name a few. A neural network-based smart sensor is described that can accomplish these tasks individually or in
combination, in real time or near real time. As a computationally intensive example, the case of optical imaging
through volume turbulence is addressed. For imaging systems in the visible and near infrared part of the
electromagnetic spectrum, the atmosphere is often the dominant factor in reducing the imaging system's resolution and
image quality. The neural network approach described in this paper is shown to present a viable means for
implementing turbulence compensation techniques for near-field and distributed turbulence scenarios. Representative
high-speed neural network hardware is presented. Existing 2-D cellular neural network (CNN) hardware is capable of 3
trillion operations per second with peta-operations per second possible using current 3-D manufacturing processes.
This hardware can be used for high-speed applications that require fast convolutions and deconvolutions. Existing 3-D
artificial neural network technology is capable of peta-operations per second and can be used for fast array processing
operations. Methods for optical imaging through distributed turbulence are discussed, simulation results are presented
and computational and performance assessments are provided.
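In software terms, the fast convolution/deconvolution workload cited above reduces to a pair of 2-D FFTs and a pointwise filter. The sketch below shows generic frequency-domain Wiener deconvolution, the kind of operation such hardware would accelerate; the scalar noise-to-signal ratio `nsr` is an assumed tuning parameter, and nothing here is specific to the paper's hardware:

```python
import numpy as np

def wiener_deconvolve(blurred, otf, nsr=1e-2):
    """Frequency-domain Wiener deconvolution.

    blurred : degraded 2-D image
    otf     : optical transfer function on the matching frequency grid
    nsr     : assumed scalar noise-to-signal power ratio (regularizer)
    """
    B = np.fft.fft2(blurred)
    W = np.conj(otf) / (np.abs(otf) ** 2 + nsr)  # Wiener inverse filter
    return np.real(np.fft.ifft2(W * B))
```

With a small `nsr` and low-noise data this nearly inverts the blur; larger values trade resolution for noise suppression.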
Phase differences on a sampled grid in the pupil plane of a coherent imaging system are used, in conjunction with a hidden phase approach, to estimate images of coherently illuminated objects in the presence of additive Gaussian noise. The imaging system is located in the far field with respect to the illuminated objects. Conventional least squares and minimum variance image reconstruction approaches are shown to fail because of the presence of point discontinuities in the far-field speckle pattern. Conventional phase reconstruction techniques cannot properly sense the phase effects resulting from these point discontinuities, or branch points. However, these conventional image reconstruction techniques can be made to work with the addition of a hidden phase term that accounts for the phase effects resulting from the branch points. The hidden phase term is added to the results of both the least squares and minimum variance phase reconstructors, and this addition is shown to successfully recover the images of several types of extended objects.
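Branch points of the kind discussed above can be located with the standard residue test: sum the wrapped phase differences around each 2x2 pixel loop and flag loops whose circulation is ±2π. The NumPy sketch below implements that generic test as a stand-in illustration; it is not the paper's hidden-phase reconstruction itself:

```python
import numpy as np

def wrap(a):
    """Wrap angles to the interval (-pi, pi]."""
    return np.angle(np.exp(1j * a))

def branch_point_residues(phase):
    """Residue map for a sampled 2-D phase function.

    Sums the wrapped phase differences around each 2x2 pixel loop
    (counterclockwise); a nonzero circulation of +-2*pi marks a branch
    point inside that loop.  Returns an (n-1) x (n-1) integer array.
    """
    d1 = wrap(phase[:-1, 1:] - phase[:-1, :-1])   # top edge, left to right
    d2 = wrap(phase[1:, 1:] - phase[:-1, 1:])     # right edge, downward
    d3 = wrap(phase[1:, :-1] - phase[1:, 1:])     # bottom edge, right to left
    d4 = wrap(phase[:-1, :-1] - phase[1:, :-1])   # left edge, upward
    return np.round((d1 + d2 + d3 + d4) / (2 * np.pi)).astype(int)
```

A zero of the complex field (an optical vortex) produces exactly one nonzero residue in the loop that encloses it, which is what conventional least squares reconstructors fail to sense.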
Sampled irradiance measurements in the image plane of a coherently illuminated object are used to estimate the 2-D object brightness profile in a sheared-beam imaging configuration. A field correlation function is used in a minimum variance irradiance estimation algorithm to optimally estimate the object's 2-D brightness from a collection of sampled irradiance measurements on a grid of points in the image plane. The efficacy of the reconstruction method is demonstrated by reconstructing simulated coherently illuminated images of a symmetric extended object, an asymmetric extended object, and a spatially distended point source object. A theoretical error metric is determined and shown to compare favorably with simulated results over a range of object sizes.
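The estimator family underlying the algorithm above is linear minimum-variance (Wiener) estimation: given measurements y with covariance C_yy and cross-covariance C_xy with the unknown x, the estimate x̂ = C_xy C_yy⁻¹ y minimizes the expected squared error. The sketch below is that generic textbook form only, not the paper's field-correlation-based irradiance algorithm:

```python
import numpy as np

def minimum_variance_estimate(y, Cxy, Cyy):
    """Linear minimum-variance estimate of x from measurements y.

    Cxy : cross-covariance between the unknown x and the data y
    Cyy : covariance of the data y
    Returns x_hat = Cxy @ inv(Cyy) @ y, which minimizes E||x - x_hat||^2
    over all linear estimators (for zero-mean x and y).
    """
    return Cxy @ np.linalg.solve(Cyy, y)
```

In the imaging context, the covariances would be supplied by a field correlation model rather than assumed known, which is where the paper's contribution lies.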
Recently, it has been shown that least squares phase estimators require the determination of a hidden phase to properly reconstruct a field that contains branch points. The branch points, resulting from zeros in the complex field, occur at random locations. These branch points introduce discontinuities of magnitude 2π in the 2-D phase function. Conventional least squares and minimum variance phase reconstructors do not properly sense these discontinuities and therefore have difficulty reconstructing the field resulting from a coherently illuminated object. Preliminary investigations are made into the utility of the hidden phase for minimum variance-based phase reconstructors. A simple 2-D image model is reconstructed using the hidden phase adjustments.
The Air Force Office of Scientific Research (AFOSR) is launching a research program in imaging physics planned to start in fiscal year 1997 (FY97). Both active (man-made illumination sources) and passive (solar-illuminated) imaging methods will be included in the program. The purpose of the program is to develop a national thrust for imaging science that will lay the foundation for future Air Force imaging systems. The new imaging physics program will be jointly administered by the Directorate of Physics and Electronics (AFOSR/NE) and the Directorate of Mathematics and Geosciences (AFOSR/NM), with collaborations with the Directorate of Life Sciences. The combined NE, NM, and NL imaging program will apply innovative mathematical formalisms (wavelets, nonlinear partial differential equations, inverse methods, statistical techniques, optimization methods, ...) to the imaging problem (object representation, atmospheric turbulence compensation and noise modeling, innovative imaging techniques, multispectral imaging, data and sensor fusion, smart sensors, imaging neural nets, phase retrieval, ...). The electronic emulation of biological vision processes for intelligent information identification and extraction in a timely manner is also of interest. A description of AFOSR and the current and planned imaging physics program is presented.