UC Santa Barbara Electronic Theses and Dissertations

Dynamical Systems and Neural Networks: From Implicit Models to Memory Retrieval

Abstract

The field of machine learning has made great strides in the past two decades, leading to breakthroughs across many domains, yet significant challenges remain. Scalability is a persistent hurdle: current models struggle to handle growing data volumes and model complexity efficiently. That struggle is thrown into relief by the human brain, whose remarkable efficiency underscores the limitations of today's machine learning models. Finally, both industry and academia have found it difficult to advance robotics, particularly to enable neural networks to interact reliably with the physical world.

In this work, we explore how the lens of dynamical systems can address all three of these challenges. We tackle memory constraints by framing a quantization algorithm as a fixed-point equation, which enables efficient differentiation. We investigate an energy-based, biological model of human memory and reinterpret the widely used self-attention mechanism through it. Lastly, we leverage contraction theory to train a neural network that follows a trajectory with stability and robustness. In the first two problems we use dynamical systems to differentiate neural networks; in the third we use a neural network to learn a dynamical system. Building on these theoretical and practical advances, we offer new insights into both memory optimization and robust robotic manipulation.
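
The first of these ideas, differentiating through a fixed-point equation, can be illustrated with a short sketch. The code below is not the dissertation's quantization algorithm; it is a generic, minimal example (the map f, the solver, and all names are illustrative assumptions) showing how the implicit function theorem yields the Jacobian of a fixed point without backpropagating through the solver's iterations.

```python
import jax
import jax.numpy as jnp

def fixed_point(f, x, z0, n_iters=100):
    # Plain forward iteration z <- f(z, x); assumes f is a contraction in z.
    z = z0
    for _ in range(n_iters):
        z = f(z, x)
    return z

def fixed_point_jacobian(f, x, z0):
    # Implicit function theorem at the solution z*:
    #   z* = f(z*, x)  =>  dz*/dx = (I - df/dz)^{-1} df/dx,
    # so no gradients flow through the solver iterations themselves.
    z_star = fixed_point(f, x, z0)
    dfdz = jax.jacobian(f, argnums=0)(z_star, x)
    dfdx = jax.jacobian(f, argnums=1)(z_star, x)
    return jnp.linalg.solve(jnp.eye(z_star.size) - dfdz, dfdx)

# Toy example: a contractive map, so forward iteration converges.
A = 0.1 * jnp.eye(2)
f = lambda z, x: A @ jnp.tanh(z) + x
dz_dx = fixed_point_jacobian(f, jnp.array([0.5, -0.3]), jnp.zeros(2))
print(dz_dx)  # 2x2 Jacobian of the fixed point with respect to the input x
```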
