UC Davis Electronic Theses and Dissertations
Towards Computational and Sample Efficiency in Stochastic Optimization

Abstract

Stochastic optimization is a crucial tool in machine learning, statistics, and operations research, and developing efficient algorithms for it is of great importance. This dissertation focuses on stochastic composite optimization, where the objective is the sum of a smooth expected-value function and a deterministic non-smooth component. We propose a class of algorithms called proximal averaged stochastic approximation (Prox-ASA), which estimates the gradient with a moving-average scheme. We prove convergence of Prox-ASA to a first-order stationary point in several senses: in expectation, with high probability, and asymptotically almost surely. In addition, we show that Prox-ASA can be extended to decentralized problems and to stochastic compositional optimization problems. For the non-convex constrained setting where projections are expensive, we propose a novel class of conditional-gradient-based algorithms for stochastic multi-level compositional optimization that achieve the same sample complexity as the single-level setting under standard assumptions. Lastly, we demonstrate that by leveraging interpolation-like conditions satisfied by overparameterized models, we can improve the oracle complexities of stochastic conditional gradient methods.
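To make the moving-average idea concrete, the sketch below shows a minimal Prox-ASA-style loop under illustrative assumptions: the non-smooth component is taken to be an l1 penalty (so the proximal map is soft-thresholding), and the step size `eta`, averaging weight `beta`, penalty weight `lam`, and the helper names are hypothetical choices, not the dissertation's exact algorithm or parameters.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (an example non-smooth component)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_asa_sketch(x0, stochastic_grad, T=1000, eta=0.01, beta=0.1, lam=0.01):
    """Illustrative Prox-ASA-style loop (hypothetical parameters).

    stochastic_grad(x) should return an unbiased stochastic gradient of the
    smooth expected-value part of the objective at x.
    """
    x = np.asarray(x0, dtype=float)
    g = stochastic_grad(x)  # initialize the gradient estimate
    for _ in range(T):
        # Moving-average update of the gradient estimate.
        g = (1.0 - beta) * g + beta * stochastic_grad(x)
        # Proximal step on the assumed non-smooth component (l1 penalty here).
        x = soft_threshold(x - eta * g, eta * lam)
    return x

# Usage on a made-up problem: noisy least squares with an l1 penalty.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((200, 20)), rng.standard_normal(200)
grad = lambda x: A.T @ (A @ x - b) / len(b) + 0.1 * rng.standard_normal(20)
x_hat = prox_asa_sketch(np.zeros(20), grad)
```

The moving average damps the noise in individual stochastic gradients before the proximal step is applied, which is the mechanism the abstract refers to; the specific penalty and constants above are only for illustration.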
