The US Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) and the Redstone Test
Center (RTC) have formed the Scene Generation Development Center (SGDC) to support the Department of Defense
(DoD) open source EO/IR Scene Generation initiative for real-time hardware-in-the-loop and all-digital simulation.
Various branches of the DoD have invested significant resources in the development of advanced scene and target
signature generation codes. The SGDC goal is to maintain unlimited government rights and controlled access to
government open source scene generation and signature codes. In addition, the SGDC provides development support to a
multi-service community of test and evaluation (T&E) users, developers, and integrators in a collaborative environment.
The SGDC has leveraged the DoD Defense Information Systems Agency (DISA) ProjectForge
(https://Project.Forge.mil), which provides a collaborative development and distribution environment for the DoD
community. The SGDC will develop and maintain several codes for tactical and strategic simulation, such as the Joint
Signature Image Generator (JSIG), the Multi-spectral Advanced Volumetric Real-time Imaging Compositor (MAVRIC),
and Office of the Secretary of Defense (OSD) Test and Evaluation Science and Technology (T&E/S&T) thermal
modeling and atmospherics packages such as EOView, CHARM, and STAR. Other utility packages include
ContinuumCore for real-time messaging and data management and IGStudio for run-time visualization and scenario
generation.
AMRDEC has developed the Multi-spectral Advanced Volumetric Real-time Imaging Compositor (MAVRIC) prototype
for distributed real-time hardware-in-the-loop (HWIL) scene generation. MAVRIC is a dynamic, object-based, energy-conserved scene compositor that can seamlessly convolve distributed scene elements into temporally aligned, physics-based scenes, enhancing existing AMRDEC scene generation codes. The volumetric compositing process accepts
input independent of depth order. This real-time compositor framework is built around AMRDEC's ContinuumCore
API, which provides a common messaging interface leveraging the Neutral Messaging Language (NML) for local,
shared memory, reflective memory, network, and remote direct memory access (RDMA) communications, and the Joint
Signature Image Generator (JSIG), which provides an energy-conserved scene component interface at each render node. This
structure allows for a highly scalable real-time environment capable of rendering individual objects at high fidelity while
respecting real-time hardware-in-the-loop constraints such as latency. As such, this system can be scaled to
handle highly complex, detailed scenes such as urban environments. This architecture provides the basis for common
scene generation, as it allows disparate scene elements to be calculated by various phenomenology codes and
integrated seamlessly into a unified composited environment. This advanced capability is the gateway to higher fidelity
scene generation, such as ray tracing. High speed interconnects using PCI Express and InfiniBand were examined to
support distributed scene generation whereby the scene graph, associated phenomenology, and the scene elements can be
dynamically distributed across multiple high performance computing assets to maximize system performance.
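The depth-order-independent compositing step can be illustrated with a short sketch. The following C++ fragment uses illustrative structure and field names rather than MAVRIC's actual interface; it shows how per-pixel scene-element samples submitted in arbitrary order could be sorted by range and accumulated in an energy-conserving manner:

```cpp
#include <algorithm>
#include <vector>

// One scene-element sample for a single pixel, as a render node might
// submit it: in-band radiance, the element's transmittance, and range
// from the sensor. These names are illustrative, not MAVRIC's API.
struct Fragment {
    float radiance;       // in-band radiance contributed by this element
    float transmittance;  // fraction of background energy passed through
    float range;          // distance from sensor; used only for ordering
};

// Composite an unordered list of fragments for one pixel. Because each
// input carries its own range, submission order does not matter: the
// compositor sorts near-to-far and attenuates each element by the
// cumulative transmittance of everything in front of it, so total
// energy is conserved rather than clipped or renormalized.
float compositePixel(std::vector<Fragment> frags, float background)
{
    std::sort(frags.begin(), frags.end(),
              [](const Fragment& a, const Fragment& b) {
                  return a.range < b.range;
              });

    float total = 0.0f;   // accumulated at-sensor radiance
    float passed = 1.0f;  // cumulative transmittance of nearer elements
    for (const Fragment& f : frags) {
        total  += passed * f.radiance;
        passed *= f.transmittance;
    }
    return total + passed * background;  // background attenuated last
}
```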
AMRDEC sought an improved framework for real-time hardware-in-the-loop (HWIL) scene generation to provide
the flexibility needed to adapt to rapidly changing hardware and the ability to more seamlessly
integrate external third-party codes for best-of-breed real-time scene generation. As such, AMRDEC has developed
Continuum, a new software architecture foundation that allows for the integration of these codes into an HWIL lab facility
while enhancing existing AMRDEC HWIL scene generation codes such as the Joint Signature Image Generator (JSIG).
This new real-time framework takes a minimalistic, modular approach based on the National Institute of Standards and
Technology (NIST) Neutral Messaging Language (NML), which provides the basis for common HWIL scene generation. High speed
interconnects and protocols were examined to support distributed scene generation whereby the scene graph, associated
phenomenology, and resulting scene can be designed around the data rather than a framework, and the scene elements
can be dynamically distributed across multiple high performance computing assets. Because of this open architecture
approach, the framework facilitates scaling from a single GPU "traditional" PC scene generation system to a multi-node
distributed system requiring load distribution and scene compositing across multiple high performance computing
platforms. This takes advantage of the latest advancements in GPU hardware, such as NVIDIA's Tesla and Fermi
architectures, improving both the fidelity and the performance of the associated scene phenomenology.
Other features of Continuum extend the framework to visualization, diagnostic, analysis,
configuration, and other HWIL and all-digital simulation tools.
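To make the NML-based messaging approach concrete, the following C++ sketch follows the NIST RCS library's published NML usage pattern. The message type, fields, and the buffer, process, and configuration names are placeholders, not Continuum's actual interface:

```cpp
#include "rcs.hh"  // NIST RCS library: NML, NMLmsg, CMS

#define ENTITY_STATE_TYPE ((NMLTYPE)1001)

// A hypothetical scene-element state message; real Continuum messages
// would carry whatever the scene graph and phenomenology codes need.
struct EntityState : public NMLmsg {
    EntityState() : NMLmsg(ENTITY_STATE_TYPE, sizeof(EntityState)) {}
    void update(CMS *cms);   // marshals fields for any transport
    double position[3];
    double timestamp;
};

void EntityState::update(CMS *cms)
{
    cms->update(position, 3);
    cms->update(timestamp);
}

// The format function lets one channel carry several message types;
// NML selects the transport (local, shared memory, network, etc.)
// from the configuration file, so this code is transport-agnostic.
int sceneFormat(NMLTYPE type, void *buffer, CMS *cms)
{
    if (type == ENTITY_STATE_TYPE) {
        ((EntityState *)buffer)->update(cms);
        return 1;
    }
    return 0;
}

int main()
{
    // Placeholder buffer, process, and config names.
    NML channel(sceneFormat, "entity_buf", "scenegen", "continuum.nml");
    EntityState state;
    state.position[0] = 100.0; state.position[1] = 0.0; state.position[2] = 50.0;
    state.timestamp = 0.0;
    channel.write(state);  // same call regardless of underlying transport
    return 0;
}
```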
AMRDEC has developed real-time rendering techniques for generating a dynamic, physics-based, high-altitude
wake geometric model for ablating objects re-entering the Earth's atmosphere. Computation was optimized for COTS
graphics processing unit (GPU) hardware using minimal preprocessing for operating in AMRDEC's Hardware-in-the-
Loop (HWIL) facility. These techniques are built around the Joint Signature Image Generator (JSIG) framework and
involve a five-stage render process per frame. JSIG's zoom anti-aliasing algorithms were used to preserve the depth buffer
information required by several of the render stages. Key features of this modeling technique are dynamic flow-field mesh generation based upon an object's arbitrary angle of attack and volumetric line-of-sight integration. The
concepts developed under this effort can be extended to other areas, such as atmospherics and cloud modeling, plume
modeling, and marine effects.
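The volumetric line-of-sight integration can be sketched as a simple ray march. The C++ fragment below performs per-ray the same accumulation a GPU shader would perform per pixel; the field-sampling callback and step size are illustrative, and the actual JSIG render stages and mesh layout are not reproduced:

```cpp
#include <cmath>

// Emission and extinction of the wake medium per unit path length,
// sampled at distance s along the sensor ray. In practice this would
// sample the dynamically generated flow-field mesh.
struct Sample { float emission; float extinction; };
typedef Sample (*FieldFn)(float s);

// March from range s0 to s1 in steps of ds, accumulating self-emission
// attenuated by Beer-Lambert extinction of the medium in front of it.
float integrateLineOfSight(FieldFn field, float s0, float s1, float ds)
{
    float radiance = 0.0f;
    float transmittance = 1.0f;
    for (float s = s0; s < s1; s += ds) {
        Sample smp = field(s);
        radiance += transmittance * smp.emission * ds;
        transmittance *= std::exp(-smp.extinction * ds);
        if (transmittance < 1e-4f) break;  // early exit: path is opaque
    }
    return radiance;
}
```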
KEYWORDS: Zoom lenses, Calibration, High dynamic range imaging, Projection systems, Convolution, Error analysis, Commercial off the shelf technology, Image processing, OpenGL, Linear filtering
AMRDEC has developed and implemented new techniques for rendering real-time 32-bit floating point energy-conserved
dynamic scenes using commercial-off-the-shelf (COTS) personal computer (PC) based hardware and high
performance NVIDIA Graphics Processing Units (GPUs). The AMRDEC IGStudio rendering framework with the real-time
Joint Signature Image Generator (JSIG) core has been integrated into numerous AMRDEC Hardware-in-the-Loop
(HWIL) facilities, successfully replacing the lower fidelity legacy SGI hardware and software. JSIG uses high dynamic
range unnormalized radiometric 32-bit floating point rendering through the use of GPU frame buffer objects (FBOs). A
high performance nested zoom anti-aliasing (NZAA) technique was developed to address performance and geometric
errors of past zoom anti-aliasing (ZAA) implementations. The NZAA capability for multi-object and occluded object
representations includes: cluster ZAA, object ZAA, sub-object ZAA, and point source generation for unresolved objects.
This technique has an optimal 128x128 pixel asymmetrical field-of-view zoom. The current NZAA capability supports
up to 8 objects in real time, with a near-future capability of increasing to a theoretical 128 objects in real time. JSIG
performs other dynamic entity effects which are applied in vertex and fragment shaders. These effects include floating
point dynamic signature application, dynamic model ablation heating models, and per-material thermal emissivity roll-off
interpolated on a per-pixel zoomed-window basis. JSIG additionally performs full-scene per-pixel effects in a
post-render process. These effects include real-time convolutions, optical scene corrections, per-frame calibrations, and
energy distribution blur used to compensate for projector element energy limitations.
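The unnormalized floating point render path can be illustrated with standard OpenGL calls. The following C++ sketch creates a GL_RGBA32F framebuffer object of the kind described above; the resolution, filtering, and error handling choices are illustrative rather than JSIG's actual configuration:

```cpp
#include <GL/glew.h>  // loads the FBO and float-texture entry points

// Create a 32-bit floating point render target so radiometric values
// accumulate unnormalized and unclamped, rather than being quantized
// to an 8-bit [0,1] display range.
GLuint createFloatTarget(int width, int height)
{
    GLuint tex = 0, fbo = 0;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    // GL_RGBA32F keeps full single-precision radiance per channel.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, width, height,
                 0, GL_RGBA, GL_FLOAT, nullptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        return 0;  // caller handles the failure

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    return fbo;
}
```

Subsequent render passes, such as the convolution and calibration effects above, would then sample this texture in fragment shaders without any loss of radiometric precision.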
AMRDEC has successfully tested hardware and software for Real-Time Scene Generation for IR and SAL Sensors on COTS PC-based hardware and video cards. AMRDEC personnel worked with NVIDIA and Concurrent Computer Corporation to develop a scene generation system capable of frame rates of at least 120 Hz while frame-locked to an external source (such as a missile seeker) with no dropped frames. Latency measurements and image validation were performed using COTS and in-house developed hardware and software. Software for the scene generation system was developed using OpenSceneGraph.
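A minimal OpenSceneGraph render loop slaved to an external trigger might look like the sketch below. The synchronization hook is a stand-in for the actual frame-lock hardware, here simulated with a fixed 120 Hz sleep, and the model file name is a placeholder:

```cpp
#include <osgViewer/Viewer>
#include <osgDB/ReadFile>
#include <unistd.h>

// Stand-in for the external frame-lock source (e.g. a seeker sync
// signal). Here it simply sleeps one 120 Hz period (~8.33 ms); the
// real system would block on the hardware trigger.
static void waitForExternalSync() { usleep(8333); }

int main()
{
    osgViewer::Viewer viewer;
    viewer.setSceneData(osgDB::readNodeFile("scene.ive"));  // placeholder model
    viewer.realize();

    // Drive OSG's frame loop manually so each rendered frame is slaved
    // to the external trigger rather than free-running; as long as the
    // render completes within the 120 Hz period, no frames are dropped.
    while (!viewer.done()) {
        waitForExternalSync();
        viewer.frame();  // event, update, and render traversals
    }
    return 0;
}
```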
This paper describes the current research and development of advanced scene generation technology for integration into the Advanced Multispectral Simulation Test and Acceptance Resource (AMSTAR) Hardware-in-the-Loop (HWIL) facilities at the US Army AMRDEC and US Army Redstone Technical Test Center at Redstone Arsenal, AL. A real-time multi-mode (infrared (IR) and semi-active laser (SAL)) scene generator for a tactical sensor system has been developed leveraging COTS hardware and open source software (OSS). A modular, plug-in architecture has been developed that supports rapid reconfiguration, permitting the use of a variety of state data input sources, geometric model formats, and signature and material databases. The platform-independent software yields a cost-effective upgrade path for integrating best-of-breed personal computer (PC) graphics processing unit (GPU) technology.
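The plug-in pattern such an architecture implies can be sketched as a shared-library factory; the interface and the entry-point symbol below are assumptions for illustration, not the AMSTAR code's actual ABI:

```cpp
#include <dlfcn.h>
#include <cstdio>

// Hypothetical interface a state-data input plug-in would implement;
// model-format and signature-database handlers would follow the same
// pattern with their own interfaces.
struct StateSource {
    virtual ~StateSource() {}
    virtual bool nextState(double pos[3], double &t) = 0;  // one state sample
};

typedef StateSource *(*CreateFn)();

// Load a plug-in shared library and instantiate its source through an
// agreed-upon factory symbol, so new input sources can be swapped in
// at configuration time without relinking the scene generator.
StateSource *loadStateSource(const char *libPath)
{
    void *handle = dlopen(libPath, RTLD_NOW);
    if (!handle) { std::fprintf(stderr, "%s\n", dlerror()); return nullptr; }

    CreateFn create = (CreateFn)dlsym(handle, "createStateSource");
    if (!create) { dlclose(handle); return nullptr; }
    return create();
}
```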