- Marshall, MR;
- Hellfeld, D;
- Joshi, THY;
- Salathe, M;
- Bandstra, MS;
- Bilton, KJ;
- Cooper, RJ;
- Curtis, JC;
- Negut, V;
- Shurley, AJ;
- Vetter, K
Networked detector systems can be deployed in urban environments to aid in the detection and localization of radiological and/or nuclear material. However, effectively responding to and interpreting a radiological alarm using spectroscopic data alone may be hampered by a lack of situational awareness, particularly in complex environments. This study investigates the use of Light Detection and Ranging (LiDAR) and streaming video to enable real-time object detection and tracking, and the fusion of this tracking information with radiological data for the purposes of enhanced situational awareness and increased detection sensitivity. This work presents an object detection and tracking pipeline, together with a novel source-object attribution analysis, that is capable of operating in real time. By implementing this analysis pipeline on a custom-developed system that comprises a static 2 in. × 4 in. × 16 in. NaI(Tl) detector colocated with a 64-beam LiDAR and four monocular cameras, we demonstrate the ability to accurately correlate trajectories of tracked objects with spectroscopic gamma-ray data in real time and to use physics-based models to reliably discriminate between source-carrying and non-source-carrying objects. We describe our approach in detail and present a quantitative performance assessment that characterizes the source-object attribution capabilities of both video and LiDAR. Additionally, we demonstrate the ability to simultaneously track pedestrians and vehicles in a mock urban environment and use this tracking information to improve both detection sensitivity and situational awareness using our contextual-radiological data fusion methodology.
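To illustrate the general idea behind physics-based source-object attribution, the sketch below shows a minimal version under simplifying assumptions: a tracked object's distance-to-detector time series is converted to an expected count-rate profile with an inverse-square point-source model plus constant background, and a Poisson log-likelihood ratio scores how well that track explains the observed gamma-ray counts. The function names, rate parameters, and toy trajectories are illustrative assumptions, not the system's actual implementation, and attenuation and detector response are ignored.

```python
import numpy as np

def expected_counts(distances_m, dt_s, source_cps_at_1m, background_cps):
    """Expected counts per time bin for a point source carried along a track:
    inverse-square falloff plus a constant background (attenuation and
    detector response ignored in this sketch)."""
    return dt_s * (background_cps + source_cps_at_1m / np.maximum(distances_m, 0.1) ** 2)

def attribution_score(observed_counts, distances_m, dt_s,
                      source_cps_at_1m, background_cps):
    """Poisson log-likelihood ratio of 'this object carries the source'
    versus 'background only', evaluated over the object's track."""
    mu_src = expected_counts(distances_m, dt_s, source_cps_at_1m, background_cps)
    mu_bkg = np.full_like(mu_src, background_cps * dt_s)
    # Constant log(k!) terms cancel in the ratio.
    return np.sum(observed_counts * (np.log(mu_src) - np.log(mu_bkg)) - (mu_src - mu_bkg))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dt = 1.0                                  # 1 s count bins
    t = np.arange(0.0, 30.0, dt)
    # Hypothetical tracks: two pedestrians passing 3 m from the detector, 15 s apart.
    dist_a = np.hypot(3.0, 1.5 * (t - 8.0))
    dist_b = np.hypot(3.0, 1.5 * (t - 23.0))
    bkg, src = 30.0, 200.0                    # assumed background cps and source cps at 1 m
    counts = rng.poisson(expected_counts(dist_a, dt, src, bkg))  # source travels with A
    for name, d in [("A (source-carrying)", dist_a), ("B (no source)", dist_b)]:
        print(name, attribution_score(counts, d, dt, src, bkg))
```

In this toy comparison, the track whose closest approach coincides with the observed count-rate excess (A) scores much higher than the other track (B), which is the discrimination behavior the abstract describes for source-carrying versus non-source-carrying objects.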