Computational Imaging and Sensing in Diagnostics with Deep Learning
eScholarship — Open Access Publications from the University of California
UCLA Electronic Theses and Dissertations

Abstract

Computational imaging and sensing aim to redesign optical systems from the ground up, jointly considering both hardware/sensors and software/reconstruction algorithms to enable new modalities with superior capabilities, speed, cost, and/or footprint. Often, systems can be optimized with targeted applications in mind, such as low-light imaging or remote sensing in a specific spectral regime. For medical diagnostics in particular, computational sensing could enable more portable, cost-effective systems and in turn improve access to care. In the last decade, the increased availability of data and cost-effective computational resources, coupled with the commodification of neural networks, has accelerated and expanded the potential of these computational sensing systems.

First, I will present my work on a cost-effective system for quantifying antimicrobial resistance, which could be of particular use in resource-limited settings, where poverty, population density, and a lack of healthcare infrastructure lead to the emergence of some of the most resistant strains of bacteria. The device uses optical fibers to spatially subsample all 96 wells of a standard microplate without any scanning components, and a neural network identifies bacterial growth from the optical intensity information captured by the fibers. Our accelerated antimicrobial susceptibility testing system can interface with the current laboratory workflow and, when blindly tested on patient bacteria at UCLA Health, identified bacterial growth after an average of 5.72 h, as opposed to the 18–24 h required by the gold-standard method. The system is completely automated, avoiding the need for a trained medical technologist to manually inspect each well of a standard 96-well microplate for growth.

Second, I will discuss a deep learning-enabled spectrometer framework using localized surface plasmon resonance.
By fabricating an array of periodic nanostructures with varying geometries, we created a “spectral encoder chip” whose spatial transmission intensity depends on the incident spectrum of light. A neural network uses the transmitted intensities captured by a CMOS image sensor to faithfully reconstruct the underlying spectrum. Unlike conventional diffraction-based spectrometers, this framework scales to large areas through imprint lithography, lends itself to compact, lightweight designs, and, crucially, does not suffer from the resolution–signal strength tradeoff inherent to grating-based designs.
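The forward model behind such an encoder-based spectrometer can be sketched in a few lines: each nanostructure tile imposes a distinct spectral transmission, so the sensor readings are (approximately) a known linear mixture of the unknown spectrum. The sketch below is illustrative only — the matrix, wavelength grid, and test spectrum are all hypothetical, and it substitutes a simple least-squares inversion for the trained neural network the dissertation actually uses.

```python
import numpy as np

rng = np.random.default_rng(0)
n_wavelengths, n_tiles = 64, 100  # hypothetical discretization

# Stand-in transmission matrix: row i is the spectral response of
# nanostructure tile i (obtained by calibration in a real system).
T = rng.uniform(0.0, 1.0, size=(n_tiles, n_wavelengths))

# Ground-truth input spectrum: a single Gaussian line (illustrative).
wl = np.linspace(400.0, 700.0, n_wavelengths)  # nm
spectrum = np.exp(-((wl - 550.0) / 15.0) ** 2)

# Intensities the CMOS image sensor would record under each tile.
y = T @ spectrum

# Linear least-squares recovery as a stand-in for the neural network.
recovered, *_ = np.linalg.lstsq(T, y, rcond=None)

print(np.max(np.abs(recovered - spectrum)))  # small reconstruction error
```

Because there are more tiles than wavelength bins, the noiseless system is overdetermined and consistent, so least squares recovers the spectrum essentially exactly; a trained network becomes valuable once noise, nonlinearity, and fabrication variation enter the picture.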

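As an illustration of the growth-detection step in the first system, the toy sketch below maps a per-well intensity trace (as a fiber channel might record it) to a growth/no-growth decision. Everything here is hypothetical — the trace model, time scale, and threshold rule are invented for illustration, and the simple threshold stands in for the trained neural network described above.

```python
import numpy as np

def well_trace(growing, n_timepoints=60, seed=0):
    """Hypothetical intensity trace from one optical fiber channel.

    A growing well is modeled as a sigmoidal rise in scattered-light
    intensity; a sterile well stays flat apart from sensor noise.
    """
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 10.0, n_timepoints)
    signal = 1.0 / (1.0 + np.exp(-(t - 6.0))) if growing else np.zeros_like(t)
    return signal + 0.02 * rng.standard_normal(n_timepoints)

def flag_growth(trace, threshold=0.3):
    """Toy stand-in for the trained network: flag a well once its
    smoothed intensity rises more than `threshold` above baseline."""
    smoothed = np.convolve(trace, np.ones(5) / 5, mode="valid")
    return bool(smoothed.max() - smoothed[0] > threshold)

print(flag_growth(well_trace(True)), flag_growth(well_trace(False)))
```

In practice a learned classifier replaces the fixed threshold so that subtle, strain-dependent intensity changes can be flagged much earlier than a rule like this would allow.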