This dissertation presents the development of agile sensors and streaming frameworks for real-time immersive visual analytics. I explore the mechanisms required to collect, transport, and contextually process data prior to visualization, with the intent of automatically producing navigable datasets, ideally in -- or with a natural development path to -- real-time interaction scenarios. These contributions build on the formal definitions and thorough foundations of visual analytics in prior literature, which aspire to empower discovery and dissemination by closing the interpretive divide between digital data and expert human perception. My contributions extend visual analytics techniques to new applications that connect not to stored data, but directly and interactively to the world through data collected in real time. These efforts advance the ultimate objective of capturing the state of natural real-world processes and transforming the captured data across scales of time, distance, and spectra to produce immersive presentations suitable for human processing in real time. Imposing autonomy and performance constraints on the data creation processes of a visual analytics application simultaneously requires the integration of external sensors, controls for systematic automated acquisition, mechanisms for transporting and storing sizable volumes of data, thoughtful design of the dataflow and processing architecture, and integration with an immersive display environment. This work explores the theoretical and practical aspects of addressing these challenges as encountered during the development of two instruments for real-time immersive visual analytics: a large-format industrial radiography scanner and an unmanned airborne imaging platform. Validated by these case studies, I present the results of new research that develops the agile sensors and data streaming frameworks necessary to produce data-driven real-time visual analytics applications.