eScholarship
Open Access Publications from the University of California

UC Davis Electronic Theses and Dissertations

New Algorithms for Riemannian Optimization and Minimax Problems with Machine Learning Applications

Abstract

Machine learning (ML) has long been a powerful tool for helping people solve real-world problems such as clustering, classification, and regression. But ML cannot shine without optimization! Optimization, as a mathematical tool, is essential to ML: optimization techniques are used to build efficient algorithms that train ML models by finding optimal points of the loss function. In this dissertation, using techniques from Riemannian optimization and minimax optimization, we provide two new algorithms for solving clustering and black-box attack problems.

In the second chapter of the thesis, we focus on designing a manifold proximal linear algorithm for solving the sparse spectral clustering problem. We prove that our algorithm finds an $\epsilon$-stationary point within $\mathcal{O}(\epsilon^{-2})$ iterations. Numerical experiments on clustering tasks show the advantage of our algorithm in terms of both clustering quality and running time compared with existing methods.
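As a rough illustration of the manifold proximal idea, the sketch below runs a manifold proximal gradient iteration on a simplified model $\min_{X^\top X = I} \mathrm{tr}(X^\top L X) + \mu\|X\|_1$. The problem form, the step size, and the placement of the $\ell_1$ penalty on $X$ itself (rather than the composite objective studied in the chapter) are all simplifying assumptions; this is not the exact ManPL algorithm or its analysis.

```python
import numpy as np

def polar_retraction(Y):
    # Retract an arbitrary matrix onto the Stiefel manifold
    # St(n, k) = {X : X^T X = I} via the polar decomposition.
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def manifold_prox_grad_step(X, L, mu, t):
    """One manifold proximal gradient iteration for the simplified
    sparse spectral clustering model (an illustrative assumption):
        min_{X^T X = I}  tr(X^T L X) + mu * ||X||_1 .
    """
    G = 2.0 * L @ X                       # Euclidean gradient of the smooth term
    W = X.T @ G                           # project G onto the tangent space at X
    rgrad = G - X @ (W + W.T) / 2.0
    Y = X - t * rgrad                     # Riemannian gradient step
    Y = np.sign(Y) * np.maximum(np.abs(Y) - t * mu, 0.0)  # prox of the l1 term
    return polar_retraction(Y)            # pull the iterate back onto the manifold

# Toy usage: a small random similarity graph and its Laplacian.
rng = np.random.default_rng(0)
A = rng.random((8, 8)); A = (A + A.T) / 2.0; np.fill_diagonal(A, 0.0)
Lap = np.diag(A.sum(axis=1)) - A
X = polar_retraction(rng.standard_normal((8, 2)))
for _ in range(50):
    X = manifold_prox_grad_step(X, Lap, mu=0.01, t=0.1)
assert np.allclose(X.T @ X, np.eye(2), atol=1e-8)  # stays on the manifold
```

The retraction after the soft-thresholding step is what keeps every iterate exactly orthonormal, which is the defining feature of the manifold (rather than penalized) approach.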

In the third chapter of the thesis, we consider minimax problems in which gradients are unavailable and only function values can be queried. We first present four zeroth-order algorithms covering both the deterministic and the stochastic setting. We prove that our algorithms have improved oracle complexity compared with the best known results (\cite{liu2019min}) in both settings. We also demonstrate the use of our algorithms on the distributionally robust optimization problem.
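The zeroth-order setting can be sketched with a standard Gaussian-smoothing two-point gradient estimator driving plain gradient descent ascent. The estimator, step sizes, sample counts, and the toy saddle problem below are illustrative textbook choices, not any of the four algorithms from the chapter or their complexity guarantees.

```python
import numpy as np

rng = np.random.default_rng(1)

def zo_grad(f, z, mu=1e-3, n_samples=20):
    # Two-point Gaussian-smoothing estimator: queries only function
    # values of f, never its gradient (the zeroth-order oracle).
    g = np.zeros_like(z)
    for _ in range(n_samples):
        u = rng.standard_normal(z.shape)
        g += (f(z + mu * u) - f(z - mu * u)) / (2.0 * mu) * u
    return g / n_samples

def zo_gda(f, x0, y0, eta=0.05, iters=500):
    """Zeroth-order gradient descent ascent for min_x max_y f(x, y).
    A generic sketch under assumed step sizes, not the thesis algorithms."""
    x, y = x0.copy(), y0.copy()
    for _ in range(iters):
        gx = zo_grad(lambda xx: f(xx, y), x)   # estimated gradient w.r.t. x
        gy = zo_grad(lambda yy: f(x, yy), y)   # estimated gradient w.r.t. y
        x -= eta * gx                          # descent on the min variable
        y += eta * gy                          # ascent on the max variable
    return x, y

# Toy strongly-convex-strongly-concave saddle problem with saddle at (0, 0):
# f(x, y) = ||x||^2 / 2 + x.y - ||y||^2 / 2.
f = lambda x, y: 0.5 * x @ x + x @ y - 0.5 * y @ y
x, y = zo_gda(f, np.ones(3), -np.ones(3))
```

Each iteration costs a fixed number of function evaluations instead of a gradient call, which is exactly the oracle model in which the chapter's complexity bounds are stated.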
