
UC San Diego Electronic Theses and Dissertations

Symplectic Numerical Integration at the Service of Accelerated Optimization and Structure-Preserving Dynamics Learning

Abstract

Symplectic numerical integrators for Hamiltonian systems form the paramount class of geometric numerical integrators, and have been studied extensively over the past forty years. By preserving the symplecticity of the Hamiltonian flow, symplectic integrators generate discrete solutions that enjoy many desirable qualitative properties, such as near-preservation of the energy over exponentially long time intervals; as a result, they typically exhibit superior numerical stability and better long-time fidelity to the continuous dynamics they resolve.

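To make this behavior concrete (a minimal sketch, not taken from the dissertation), the following applies the Störmer–Verlet (leapfrog) method, perhaps the best-known symplectic integrator, to a harmonic oscillator; the discrete energy error remains bounded over many steps, whereas a non-symplectic scheme such as explicit Euler would drift.

```python
import numpy as np

def leapfrog(grad_V, q, p, dt, n_steps):
    """Stormer-Verlet integrator for the separable Hamiltonian H(q, p) = p^2/2 + V(q)."""
    qs, ps = [q], [p]
    for _ in range(n_steps):
        p = p - 0.5 * dt * grad_V(q)   # half kick
        q = q + dt * p                 # full drift
        p = p - 0.5 * dt * grad_V(q)   # half kick
        qs.append(q)
        ps.append(p)
    return np.array(qs), np.array(ps)

# Harmonic oscillator: V(q) = q^2/2, so grad_V(q) = q and H = (p^2 + q^2)/2.
qs, ps = leapfrog(lambda q: q, q=1.0, p=0.0, dt=0.1, n_steps=10_000)
energy = 0.5 * (ps**2 + qs**2)
print(f"max |H - H0| over 10,000 steps: {np.abs(energy - energy[0]).max():.2e}")
```
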
The purpose of this dissertation is to establish, explore and exploit connections between symplectic numerical integration and two prominent disciplines of scientific computing: accelerated optimization for machine learning applications, and structure-preserving dynamics learning. By leveraging the well-established theory of symplectic integration, we aspire to design new algorithms for accelerated optimization and dynamics learning with superior numerical and computational properties.

Most machine learning algorithms are designed around the minimization of a loss function or the maximization of a likelihood function. Given the ever-growing size of datasets, it is critically important to obtain efficient optimization algorithms that can be executed on parallel and distributed processing architectures. Many accelerated optimization algorithms converge, as the step size goes to zero, to limiting differential equations along whose trajectories the objective function typically converges to its optimal value at an accelerated rate. As a result, the optimization problem can be replaced by the problem of evolving the dynamics governed by appropriately designed differential equations. In Part II, we construct novel optimization algorithms, both on normed vector spaces and on Riemannian manifolds, by symplectically integrating carefully designed Hamiltonian systems whose solutions converge to the minimizer at an accelerated rate.

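A well-known instance of this correspondence, due to Su, Boyd, and Candès (stated here only as an illustration, not as the dissertation's own construction), is the continuous-time limit of Nesterov's accelerated gradient method: a second-order ODE along whose solutions a convex objective decays at the accelerated rate.

```latex
% Continuous-time limit of Nesterov's accelerated gradient method
% (Su, Boyd, Candes), with f convex and x* a minimizer of f:
\[
  \ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0,
  \qquad
  f\bigl(X(t)\bigr) - f(x^\star) = O\!\left(\frac{1}{t^2}\right).
\]
```
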
Identifying accurate and efficient models for dynamical systems from observed trajectories is crucial for predicting and controlling their future behavior. Incorporating the structure underlying the dynamics into deep learning architectures has proven to be an effective approach to learning structured dynamical systems. In Part III, we design novel deep learning architectures that incorporate the geometric structure of nearly-periodic Hamiltonian systems and of controlled Lie group Hamiltonian systems, in order to learn structure-preserving surrogate evolution maps.

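The architectures in Part III target nearly-periodic and Lie group Hamiltonian systems specifically; as a much simpler generic illustration of the underlying idea (not the dissertation's method), the sketch below parameterizes an unknown potential with a small neural network and composes it through one Störmer–Verlet step, so that the learned surrogate map is exactly symplectic by construction. The class name `SymplecticSurrogate` and all hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn

class SymplecticSurrogate(nn.Module):
    """Illustrative structure-preserving surrogate map.

    A small MLP models an unknown potential V_theta(q); one Stormer-Verlet step
    of the induced separable Hamiltonian p^2/2 + V_theta(q) then defines an
    exactly symplectic evolution map (q, p) -> (q', p') by construction.
    """

    def __init__(self, dim, width=64, dt=0.1):
        super().__init__()
        self.dt = dt
        self.V = nn.Sequential(
            nn.Linear(dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def grad_V(self, q):
        """dV_theta/dq via autograd, so the momentum kick is an exact gradient field."""
        if not q.requires_grad:
            q = q.clone().requires_grad_(True)
        return torch.autograd.grad(self.V(q).sum(), q, create_graph=True)[0]

    def forward(self, q, p):
        p = p - 0.5 * self.dt * self.grad_V(q)  # half kick
        q = q + self.dt * p                     # full drift
        p = p - 0.5 * self.dt * self.grad_V(q)  # half kick
        return q, p

# Training would regress the predicted (q', p') onto observed trajectory snapshots:
model = SymplecticSurrogate(dim=2)
q0, p0 = torch.randn(8, 2), torch.randn(8, 2)
q1, p1 = model(q0, p0)
```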