A 3D imaging pulsed LADAR with a Geiger-mode APD array was assembled,
flight-tested, and deployed in response to a FEMA request for data collection
and debris-estimation analysis to support the Hurricane Harvey relief effort in Texas.
Here we report on the rapid response and application of this Geiger-mode APD
system to collect high-area-coverage-rate data for geo-mapping and
debris volume estimation. MIT Lincoln Laboratory's Airborne Optical Systems
Testbed (AOSTB), hosted on a de Havilland Twin Otter aircraft, was flown to
collect LADAR imagery of the Houston, TX area, inundated with over 50
inches of rainfall in four days, and of the Port Arthur coastal vicinity that
weathered Harvey's initial landfall. This testbed, which serves to advance
the Laboratory's effort to develop EO sensor architectures, along with the
actions of a large, dedicated team, demonstrated the usefulness of this sensor
modality for humanitarian aid and disaster relief response.
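The debris volume estimation mentioned above can be sketched as elevation-model differencing. This is an illustrative assumption, not the deployed pipeline: the function names, grid size, and noise floor are hypothetical, and the sketch assumes pre- and post-event digital elevation models (DEMs) have already been gridded from the ladar point clouds.

```python
# Illustrative sketch (an assumption, not the deployed FEMA pipeline):
# estimate debris volume by differencing pre- and post-event DEMs
# gridded from ladar point clouds.
import numpy as np

def debris_volume(pre_dem, post_dem, cell_area_m2, noise_floor_m=0.15):
    """Sum positive elevation change above a noise floor, scaled by cell area."""
    dz = post_dem - pre_dem
    dz[dz < noise_floor_m] = 0.0      # ignore small or negative changes
    return dz.sum() * cell_area_m2    # cubic meters

# Toy example: a 100x100 grid of 1 m^2 cells with a 20x20-cell debris field
pre = np.zeros((100, 100))
post = np.zeros((100, 100))
post[40:60, 40:60] = 2.0              # 2 m tall debris piles
vol = debris_volume(pre, post, cell_area_m2=1.0)
print(vol)  # 400 cells * 2 m * 1 m^2 = 800 m^3
```

The noise floor guards against counting small elevation changes (sensor noise, vegetation motion) as debris; a real workflow would tune it to the measured vertical accuracy of the point cloud.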
Mohan Vaidyanathan, Steven Blask, Thomas Higgins, William Clifton, Daniel Davidsohn, Ryan Carson, Van Reynolds, Joanne Pfannenstiel, Richard Cannata, Richard Marino, John Drover, Robert Hatch, David Schue, Robert Freehart, Greg Rowe, James Mooney, Carl Hart, Byron Stanley, Joseph McLaughlin, Eui-In Lee, Jack Berenholtz, Brian Aull, John Zayhowski, Alex Vasile, Prem Ramaswami, Kevin Ingersoll, Thomas Amoruso, Imran Khan, William Davis, Richard Heinrichs
KEYWORDS: Sensors, LIDAR, 3D image processing, 3D acquisition, Target detection, Imaging systems, Image processing, Control systems, Image sensors, Data processing
Jigsaw three-dimensional (3D) imaging laser radar is a compact, lightweight system for
semi-autonomously imaging highly obscured targets through dense foliage from an unmanned aircraft. The
Jigsaw system uses a gimbaled sensor operating in a spotlight mode to laser-illuminate a cued
target and autonomously capture and produce 3D images of hidden targets under trees at high
voxel resolution. With our MIT Lincoln Laboratory team members, the sensor system has been
integrated into a geo-referenced 12-inch gimbal, and used in airborne data collections from a UH-1
manned helicopter, which served as a surrogate platform for data collection and
system validation. In this paper, we discuss the results from the ground integration and testing of the
system, and the results from UH-1 flight data collections. We also discuss the performance results
of the system obtained using ladar calibration targets.
Richard Marino, W. Davis, G. Rich, J. McLaughlin, E. Lee, B. Stanley, J. Burnside, G. Rowe, R. Hatch, T. Square, L. Skelly, M. O'Brien, A. Vasile, R. Heinrichs
Situation awareness and accurate Target Identification (TID) are critical requirements for successful battle management. Ground vehicles can be detected, tracked, and in some cases imaged using airborne or space-borne microwave radar. Obscurants such as camouflage nets and/or tree canopy foliage can degrade the performance of such radars. Foliage can be penetrated with long-wavelength microwave radar, but generally at the expense of imaging resolution. The goals of the DARPA Jigsaw program include the development and demonstration of high-resolution 3-D imaging laser radar (ladar) sensor technology and systems that can be used from airborne platforms to image and identify military ground vehicles that may be hiding under camouflage or foliage such as tree canopy. With DARPA support, MIT Lincoln Laboratory has developed a rugged and compact 3-D imaging ladar system that has successfully demonstrated the feasibility and utility of this application. The sensor system has been integrated into a UH-1 helicopter for winter and summer flight campaigns. The sensor operates day or night and produces high-resolution 3-D spatial images using short laser pulses and a focal plane array of Geiger-mode avalanche photodiode (APD) detectors with independent digital time-of-flight counting circuits at each pixel. The sensor technology includes Lincoln Laboratory developments of the microchip laser and novel focal plane arrays. The microchip laser is a passively Q-switched, solid-state, frequency-doubled Nd:YAG laser transmitting short laser pulses (300 ps FWHM) at a 16 kHz pulse rate and at 532 nm wavelength. The single-photon detection efficiency has been measured to be > 20% using these 32x32 silicon Geiger-mode APDs at room temperature. The APD saturates while providing a gain of typically > 10^6. The pulse out of the detector is used to stop a 500 MHz digital clock register integrated within the focal-plane array at each pixel.
Using the detector in this binary response mode simplifies the signal processing by eliminating the need for analog-to-digital converters and non-linearity corrections. With appropriate optics, the 32x32 array of digital time values represents a 3-D spatial image frame of the scene. Successive image frames illuminated with the multi-kilohertz pulse repetition rate laser are accumulated into range histograms to provide 3-D volume and intensity information. In this article, we describe the Jigsaw program goals, our demonstration sensor system, the data collection campaigns, and show examples of 3-D imaging with foliage and camouflage penetration. Other applications for this 3-D imaging direct-detection ladar technology include robotic vision, navigation of autonomous vehicles, manufacturing quality control, industrial security, and topography.
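The per-pixel time-of-flight counting and range-histogram accumulation described above can be sketched as follows. This is an illustrative sketch, not the flight software: the frame format, the convention that a count of 0 means "no photon detected", and the histogram depth are all assumptions; only the 500 MHz clock, the 32x32 array size, and the binary detection model come from the abstract.

```python
# Illustrative sketch (not the flight code): convert per-pixel Geiger-mode
# APD stop-clock counts to range, and accumulate range histograms over
# many laser pulses as the abstract describes.
import numpy as np

C = 2.998e8                  # speed of light, m/s
CLOCK_HZ = 500e6             # per-pixel time-of-flight clock (500 MHz)
BIN_M = C / (2 * CLOCK_HZ)   # one clock tick ~ 0.3 m of round-trip range

def counts_to_range(counts):
    """Convert integer clock counts to one-way range in meters."""
    return counts * BIN_M

def accumulate_histogram(frames, max_count=4096):
    """Accumulate per-pixel range histograms over successive frames.

    frames: iterable of (32, 32) integer arrays of clock counts;
            a count of 0 is treated as "no photon detected" (an assumption).
    Returns an (32, 32, max_count) array of hit counts.
    """
    hist = np.zeros((32, 32, max_count), dtype=np.int32)
    for frame in frames:
        rows, cols = np.nonzero(frame)                # pixels with a detection
        hist[rows, cols, frame[rows, cols]] += 1      # one hit per pixel per pulse
    return hist

# Example: 100 simulated pulses with a target near clock count 1000
rng = np.random.default_rng(0)
frames = [rng.integers(990, 1010, size=(32, 32)) for _ in range(100)]
hist = accumulate_histogram(frames)
peak_count = int(hist[0, 0].argmax())
print(f"peak range ~ {counts_to_range(peak_count):.1f} m")  # roughly 300 m
```

Because each pixel reports a single binary stop event per pulse, the histogram peak over many pulses recovers the dominant surface return; secondary peaks in the same histogram reveal foliage above a hidden target.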
We present a pose-independent Automatic Target Detection and Recognition (ATD/R) System using data from an airborne 3D imaging ladar sensor. The ATD/R system uses geometric shape and size signatures from target models to detect and recognize targets under heavy canopy and camouflage cover in extended terrain scenes.
A method for data integration was developed to register multiple scene views to obtain a more complete 3-D surface signature of a target. Automatic target detection was performed using the general approach of “3-D cueing,” which determines and ranks regions of interest within a large-scale scene based on the likelihood that they contain a target. Each region of interest is further analyzed to accurately identify the target from among a library of 10 candidate target objects.
The system performance was demonstrated on five extended terrain scenes with targets both out in the open and under heavy canopy cover, where the target occupied 1 to 5% of the scene by volume. Automatic target recognition was successfully demonstrated for 20 measured data scenes including ground vehicle targets both out in the open and under heavy canopy and/or camouflage cover, where the target occupied between 5 and 10% of the scene by volume. Correct target identification was also demonstrated for targets with multiple movable parts in arbitrary orientations. We achieved a high recognition rate (over 99%) along with a low false alarm rate (less than 0.01%).
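The "geometric shape and size signatures" idea above can be sketched in miniature: rank a library of candidate models by comparing bounding-box extents of a region-of-interest point cloud. This is a hedged toy sketch, not the paper's ATD/R algorithm; the function names, the two-model library, and the use of sorted extents as the whole signature are illustrative assumptions.

```python
# Toy sketch (not the paper's ATD/R system): rank candidate target models
# by comparing axis-aligned bounding-box size signatures of an ROI point cloud.
import numpy as np

def size_signature(points):
    """Sorted bounding-box extents, invariant to axis ordering."""
    return np.sort(points.max(axis=0) - points.min(axis=0))

def rank_library(roi_points, library):
    """Rank candidate models by Euclidean distance between size signatures."""
    sig = size_signature(roi_points)
    scores = {name: float(np.linalg.norm(sig - size_signature(pts)))
              for name, pts in library.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])

# Hypothetical library: two corner points define each model's box (meters)
library = {
    "truck": np.array([[0.0, 0.0, 0.0], [8.0, 3.0, 3.0]]),
    "car":   np.array([[0.0, 0.0, 0.0], [4.0, 2.0, 1.5]]),
}
roi = np.array([[0.0, 0.0, 0.0], [7.9, 3.1, 2.9]])  # truck-like ROI
ranked = rank_library(roi, library)
print(ranked[0][0])  # best match: truck
```

A real system would use far richer shape signatures (surface geometry, articulated parts) to separate targets of similar size, which is how the paper reports correct identification of vehicles with movable parts in arbitrary orientations.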