Our group has developed a custom, multi-modal sensor suite and data analysis pipeline to phenotype crops in the field using unpiloted aircraft systems (UAS). This approach to high-throughput field phenotyping is part of a research initiative intended to markedly accelerate the breeding process for refined energy sorghum varieties. To date, single-rotor and multirotor helicopters, roughly 14 kg in total weight, have been employed to provide sensor coverage of multiple hectare-sized fields in tens of minutes. These quick, autonomous operations allow complete field coverage under consistent plant and lighting conditions, at low operating cost.
The sensor suite collects data simultaneously from six sensors and registers it for fusion and analysis. High-resolution color imagery, together with lidar measurements, targets color and geometric phenotypes. Long-wave infrared imagery targets temperature phenomena and plant stress. Hyperspectral visible and near-infrared imagery targets phenotypes such as biomass and chlorophyll content, as well as novel, predictive spectral signatures. Onboard spectrometers and careful laboratory and in-field calibration techniques aim to increase the physical validity of the sensor data throughout and across growing seasons. Off-line processing creates basic products such as image maps and digital elevation models; derived data products include phenotype charts, statistics, and trends.
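To make the derived-product step concrete, the following minimal Python sketch computes a per-plot vegetation-index statistic from co-registered red and near-infrared reflectance bands. The band choice, array names, and plot mask are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch of one derived-product step: a per-plot NDVI statistic
# from co-registered reflectance bands. Inputs are synthetic stand-ins for
# registered sensor output; not the authors' pipeline.
import numpy as np

def ndvi_map(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index from co-registered red/NIR bands."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def plot_statistics(ndvi: np.ndarray, plot_mask: np.ndarray) -> dict:
    """Summarize a phenotype proxy over the pixels belonging to one breeding plot."""
    values = ndvi[plot_mask]
    return {"mean": float(values.mean()),
            "std": float(values.std()),
            "p90": float(np.percentile(values, 90))}

# Example with synthetic reflectance data and a single rectangular plot mask.
rng = np.random.default_rng(0)
red = rng.uniform(0.02, 0.10, (512, 512))   # red reflectance (assumed range)
nir = rng.uniform(0.30, 0.60, (512, 512))   # near-infrared reflectance (assumed range)
mask = np.zeros((512, 512), dtype=bool)
mask[100:200, 100:300] = True               # pixels of one plot
print(plot_statistics(ndvi_map(red, nir), mask))
```

Tracking such per-plot statistics over repeated flights is what yields the phenotype trends mentioned above.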
The outcome of this work is a set of commercially available phenotyping technologies, including sensor suites, a fully integrated phenotyping UAS, and data analysis software. Efforts are also underway to transition these technologies to farm management users by way of streamlined, lower-cost sensor packages and intuitive software interfaces.
Sanjiv Singh, Gary Sherwin, Regis Hoffman, Benjamin Grocholsky, Volker Grabe, Samuel Nalbone, Lyle Chamberlain, Spencer Spiker, Marcel Bergerman, Colin Wilkinson, David Findlay
The Navy and Marine Corps will increasingly need to operate unmanned air vehicles from ships at sea. Fused multi-sensor systems are desirable to ensure these operations are highly reliable under the most demanding at-sea conditions, particularly in degraded visual environments. The US Navy Sea-Based Automated Launch & Recovery System (SALRS) program aims to enable automated and semi-automated launch and recovery of sea-based, manned and unmanned, fixed- and rotary-wing naval aircraft, and to provide automated or pilot-augmented flight mechanics for carefree shipboard operations. This paper describes the goals and current results of SALRS Phase 1, which aims to understand the capabilities and limitations of various sensor types through sensor characterization, modeling, and simulation, and to assess how the sensor models can be used for aircraft navigation with sufficient accuracy, integrity, continuity, and availability across all anticipated maritime conditions.
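As a concrete illustration of why fused multi-sensor estimates help in degraded visual environments, the minimal sketch below combines two independent, noisy relative-position measurements into a single estimate with a quantified uncertainty. The sensor types and noise levels are assumptions; this is not the SALRS navigation filter.

```python
# A minimal sketch (not the SALRS implementation) of variance-weighted fusion
# of two independent measurements of the same quantity -- e.g., an EO/IR
# tracker and a radar return -- into one estimate with a 1-sigma uncertainty,
# the kind of accuracy figure a shipboard navigation filter must report.
def fuse(z1: float, var1: float, z2: float, var2: float) -> tuple[float, float]:
    """Variance-weighted fusion of two measurements of the same quantity."""
    estimate = (var2 * z1 + var1 * z2) / (var1 + var2)
    variance = (var1 * var2) / (var1 + var2)
    return estimate, variance

# Example: relative along-deck position (m) in a degraded visual environment.
eo_meas, eo_var = 12.4, 0.8 ** 2        # EO/IR tracker, degraded by sea spray (assumed)
radar_meas, radar_var = 11.9, 0.3 ** 2  # radar, less affected by visibility (assumed)
est, var = fuse(eo_meas, eo_var, radar_meas, radar_var)
print(f"fused estimate: {est:.2f} m, 1-sigma: {var ** 0.5:.2f} m")
```

The fused variance is always smaller than either input variance, which is the basic argument for multi-sensor fusion when any single sensor may degrade at sea.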
The ability to perform off-road autonomous navigation at any time of day or night is a requirement for some unmanned ground vehicle (UGV) programs. Because there are times when it is desirable for military UGVs to operate without emitting strong, detectable electromagnetic signals, a passive-only terrain perception mode of operation is also often a requirement. Thermal infrared (TIR) cameras can be used to provide day and night passive terrain perception. TIR cameras have a detector sensitive to either mid-wave infrared (MWIR) radiation (3-5 μm) or long-wave infrared (LWIR) radiation (7-14 μm). With the recent emergence of high-quality uncooled LWIR cameras, TIR cameras have become viable passive perception options for some UGV programs. The Jet Propulsion Laboratory (JPL) has used a stereo pair of TIR cameras under several UGV programs to perform stereo ranging, terrain mapping, tree-trunk detection, pedestrian detection, negative obstacle detection, and water detection based on object reflections. In addition, we have evaluated stereo range data at a variety of UGV speeds, evaluated dual-band TIR classification of soil, vegetation, and rock terrain types, analyzed 24-hour water and 12-hour mud TIR imagery, and analyzed TIR imagery for hazard detection through smoke. Since TIR cameras do not currently provide the resolution available from megapixel color cameras, a UGV's daytime safe speed is often reduced when using TIR instead of color cameras. In this paper, we summarize the UGV terrain perception work JPL has performed with TIR cameras over the last decade and describe a calibration target developed by General Dynamics Robotic Systems (GDRS) for TIR cameras and other sensors.
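As a concrete illustration of the stereo-ranging step described above, the following sketch computes a disparity map from a rectified stereo pair using OpenCV's semi-global block matching and converts it to range. The synthetic imagery, focal length, and baseline are assumptions for illustration, not JPL's configuration.

```python
# Minimal stereo-ranging sketch: disparity from a rectified image pair via
# semi-global block matching, converted to range with Z = f * B / d.
# Synthetic images and camera geometry are assumed values.
import cv2
import numpy as np

FOCAL_PX = 500.0   # focal length in pixels (assumed)
BASELINE_M = 0.30  # stereo baseline in meters (assumed)

# Build a textured synthetic scene; shift it to fake a rectified right image.
rng = np.random.default_rng(1)
left = rng.uniform(0, 255, (240, 320)).astype(np.uint8)
left = cv2.GaussianBlur(left, (5, 5), 0)
true_disp = 16                                  # pixels
right = np.roll(left, -true_disp, axis=1)       # right view: features shift left

# Semi-global block matching; parameters are typical starting values.
sgbm = cv2.StereoSGBM_create(minDisparity=0,
                             numDisparities=64,  # must be divisible by 16
                             blockSize=7)
disparity = sgbm.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> px

valid = disparity > 0
range_m = np.zeros_like(disparity)
range_m[valid] = FOCAL_PX * BASELINE_M / disparity[valid]
print(f"median disparity: {np.median(disparity[valid]):.1f} px, "
      f"median range: {np.median(range_m[valid]):.1f} m")
```

The same geometry explains the abstract's point about resolution: with fewer pixels, each pixel of disparity error corresponds to a larger range error, which drives down the safe daytime speed relative to megapixel color cameras.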