Nuclear Scene Data Fusion (SDF) integrates radiation data and other contextual sensor measurements to enable free-moving localization and mapping of nuclear radiation from person- or robot-borne sensor systems referred to as Localization and Mapping Platforms (LAMPs). The LAMP sensor suite typically utilizes lidar to create high-fidelity 3D point clouds representing the measurement scene. The high precision and accuracy of lidar has been essential to SDF, but its substantial size, weight, power, and cost (SWaP-C) requirements limit its use in lightweight, portable applications, such as drones or handheld systems for remote field operations. In these cases, automotive millimeter-wave radar presents a cost-effective, lightweight, and energy-efficient alternative, albeit with a significant compromise in spatial resolution and accuracy. We consider utilizing 2D radar point clouds to create occupancy maps of the scene. One method we explore refines radar occupancy grids with a lidar-trained Pix2Pix conditional Generative Adversarial Network (cGAN) so that they approach the accuracy of lidar occupancy grids. We utilize these refined radar occupancy grids to locate and quantify a gamma-ray point source in an environment never before seen by our lidar-trained Pix2Pix model. We then compare the results to those generated using standard lidar occupancy grids to assess the feasibility of substituting radar for lidar in certain situations and environments.
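To illustrate one step of the pipeline described above, the sketch below rasterizes a 2D point cloud (radar or lidar) into a binary occupancy grid. This is a minimal, generic implementation, not the authors' actual processing chain: the function name, cell size, and map extent are assumptions chosen for illustration.

```python
import numpy as np

def points_to_occupancy(points, cell_size=0.1,
                        extent=((-10.0, 10.0), (-10.0, 10.0))):
    """Rasterize 2D sensor points into a binary occupancy grid.

    points    : (N, 2) array of x, y coordinates in meters.
    cell_size : grid resolution in meters per cell (illustrative default).
    extent    : ((xmin, xmax), (ymin, ymax)) bounds of the mapped area.
    Returns an (ny, nx) uint8 grid: 1 = occupied, 0 = free/unknown.
    """
    (xmin, xmax), (ymin, ymax) = extent
    nx = int(np.ceil((xmax - xmin) / cell_size))
    ny = int(np.ceil((ymax - ymin) / cell_size))
    grid = np.zeros((ny, nx), dtype=np.uint8)

    # Keep only returns that fall inside the mapped extent.
    mask = (points[:, 0] >= xmin) & (points[:, 0] < xmax) \
         & (points[:, 1] >= ymin) & (points[:, 1] < ymax)
    pts = points[mask]

    # Mark each cell containing at least one return as occupied.
    ix = ((pts[:, 0] - xmin) / cell_size).astype(int)
    iy = ((pts[:, 1] - ymin) / cell_size).astype(int)
    grid[iy, ix] = 1
    return grid
```

In a lidar-to-radar comparison of the kind described, grids like this (one per modality, or the radar grid passed through an image-to-image model such as Pix2Pix) provide the common 2D representation on which map quality can be judged.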