
New Algorithms in Computational Microscopy

Abstract

Microscopy provides tools to observe objects and their surroundings at resolutions spanning the scale of individual cells (micrometers) down to molecular machinery (angstroms). Under a microscope, illumination such as visible light, electromagnetic radiation, or an electron beam interacts with a sample and is scattered onto a detector plane, where it is recorded. Computational microscopy concerns reconstructing images from these measurements and improving their quality. As microscopy evolves, new studies emerge and new algorithms must be developed, not only to provide high-resolution imaging but also to enable new and advanced research.

In this dissertation, we focus on algorithm development for inverse problems in microscopy, specifically phase retrieval and tomography, and on the application of these techniques to machine learning. The four studies in this dissertation demonstrate the use of optimization and the calculus of variations in imaging science and other disciplines.

Study 1 focuses on coherent diffractive imaging (CDI), or phase retrieval, a non-linear inverse problem that aims to recover a 2D image from the modulus of its Fourier transform, using the extra information provided by oversampling as a second constraint. To solve this two-constraint minimization, we start from a Hamilton-Jacobi partial differential equation (HJ-PDE) and its Hopf-Lax formula. Introducing a generalized Bregman distance into the HJ-PDE and applying the Legendre transform, we derive our generalized proximal smoothing (GPS) algorithm in the form of the primal-dual hybrid gradient (PDHG) method. While the reflection operator, which acts as an extrapolating momentum, helps overcome local minima, the smoothing induced by the generalized Bregman distance is tuned to improve the convergence and consistency of phase retrieval.
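
To make the two constraints concrete, the following is a minimal sketch of a classical alternating-projection (error-reduction) loop enforcing the measured Fourier modulus and the oversampling support. It is not the GPS/PDHG iteration itself, and the function names and iteration count are illustrative choices.

```python
import numpy as np

def project_modulus(u, measured_modulus):
    """First constraint: replace the Fourier modulus of u with the measured one."""
    U = np.fft.fft2(u)
    phase = np.exp(1j * np.angle(U))
    return np.fft.ifft2(measured_modulus * phase)

def project_support(u, support):
    """Second constraint: zero the image outside the oversampling support."""
    v = np.real(u).copy()
    v[~support] = 0.0
    return v

def error_reduction(measured_modulus, support, n_iter=500, seed=0):
    """Alternate projections onto the modulus and support constraints."""
    rng = np.random.default_rng(seed)
    u = rng.random(measured_modulus.shape) * support
    for _ in range(n_iter):
        u = project_support(project_modulus(u, measured_modulus), support)
    return u
```

GPS replaces the bare projections with proximal (smoothed) operators and a reflection step, which is what improves robustness to local minima; the sketch above only illustrates the underlying feasibility problem.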

Study 2 focuses on electron tomography: 3D image reconstruction from a set of 2D projections obtained with a transmission electron microscope (TEM) or an X-ray microscope. Current tomography algorithms are limited to a single tilt axis and fail when data are fully or partially missing. Building on the calculus of variations and the Fourier slice theorem (FST), we develop a highly accurate iterative tomography algorithm that provides higher-resolution imaging, handles missing data, and supports multiple-tilt-axis tomography. The algorithm is further extended to non-isolated objects and partially blocked projections, which have become increasingly common in experiments. The success of the real space iterative reconstruction engine (RESIRE) opens a new era for tomography in materials science and for the study of magnetic structures (vector tomography).
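
As a point of reference for real-space iterative reconstruction, here is a simplified SIRT-style sketch for 2D parallel-beam tomography, where the estimate is repeatedly corrected by back-projected residuals. This is not the RESIRE algorithm; the function names, step size, and interpolation order are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import rotate

def forward_project(img, angle_deg):
    """Parallel-beam projection: rotate the image and sum along one axis."""
    return rotate(img, angle_deg, reshape=False, order=1).sum(axis=0)

def back_project(residual, angle_deg, shape):
    """Adjoint-like operator: smear the 1D residual across rows, then rotate back."""
    smear = np.tile(residual, (shape[0], 1))
    return rotate(smear, -angle_deg, reshape=False, order=1)

def reconstruct(projections, angles_deg, shape, n_iter=50):
    """Iteratively refine the reconstruction from measured projections."""
    rec = np.zeros(shape)
    step = 1.0 / shape[0]  # conservative step size for stability
    for _ in range(n_iter):
        for proj, ang in zip(projections, angles_deg):
            residual = proj - forward_project(rec, ang)
            rec += step * back_project(residual, ang, shape)
    return rec
```

The real-space formulation makes it straightforward to skip missing or blocked projections (simply omit them from the inner loop), which is the property that RESIRE exploits and extends to multiple tilt axes and 3D volumes.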

Studies 3 and 4 are applications of our algorithms to machine learning. Study 3 develops a stochastic backward Euler method for K-means clustering, a well-known non-convex optimization problem. The algorithm has been shown to find better minima and improve consistency, providing a powerful new tool for the class of clustering techniques. Study 4 is a direct application of GPS to gradient descent algorithms in deep learning. Linearizing the Hopf-Lax formula derived in GPS, we obtain Laplacian smoothing gradient descent (LSGD), also known simply as gradient smoothing. Our experiments show that LSGD can find better and flatter minima, reduce variance, and achieve higher accuracy and consistency.
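
A minimal sketch of the gradient-smoothing step is shown below, assuming a 1D discrete Laplacian with periodic boundary conditions, so that the linear solve reduces to an FFT division. The parameter sigma and the usage line are illustrative.

```python
import numpy as np

def laplacian_smooth(grad, sigma=1.0):
    """Smooth a flattened gradient by solving (I - sigma * L) g_s = grad,
    where L is the 1D periodic discrete Laplacian. The operator is circulant,
    so it is diagonalized by the FFT and the solve is a pointwise division."""
    n = grad.size
    kernel = np.zeros(n)
    kernel[0] = 1.0 + 2.0 * sigma
    kernel[1] = -sigma
    kernel[-1] = -sigma
    return np.real(np.fft.ifft(np.fft.fft(grad) / np.fft.fft(kernel)))

# Hypothetical use inside a plain gradient-descent update:
# theta -= learning_rate * laplacian_smooth(grad_of_loss(theta), sigma=1.0)
```

Because the smoothed gradient damps high-frequency components of the raw gradient, the update is less noisy, which is consistent with the reduced variance and flatter minima reported in Study 4.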
