eScholarship
Open Access Publications from the University of California


UC San Diego Electronic Theses and Dissertations

A Formal Perspective on Hyperdimensional Computing

No data is associated with this publication.
Abstract

Hyperdimensional computing (HDC) is a paradigm, originating in the cognitive-neuroscience literature, for computing on high-dimensional and distributed representations of data. The technique is simple to implement, amenable to formal analysis, and accords naturally with energy-efficient, highly parallel hardware platforms like FPGAs and "processing-in-memory" architectures. Indeed, recent years have seen substantial interest in leveraging the principles of HDC to develop highly efficient specialized hardware for cognitive information-processing tasks. In HDC, all computation is performed on high-dimensional, distributed representations of data, which are constructed using a variety of encoding (i.e., embedding) techniques. In this dissertation, I develop a set of formal tools for analyzing the basic capabilities of these representations and for obtaining sufficient conditions under which they can be used to effect cognitive information-processing tasks like learning and recall.

I first develop a framework, based on the notion of incoherence popularized in the compressed-sensing literature, for analyzing the conditions under which specific data items, and collections thereof, can be embedded into HD-space in a manner that permits reliable recovery. I analyze the performance of these architectures in the presence of noise and provide guidance on how to choose the dimension of the HD-space, a crucial consideration in practice.

In applications of HDC, the encoding operation often represents a significant computational burden, particularly when the input data is high-dimensional to begin with. I explore encoding architectures based on hashing that can mitigate this cost in certain settings, and I provide novel analyses showing that they can offer substantial performance improvements over competing techniques while enjoying similar formal guarantees for learning tasks. Hardware implementations confirm these predictions.
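The recovery property described above — that items embedded into HD-space can be reliably read back out — can be illustrated with a minimal sketch. This example is not taken from the dissertation; it assumes a standard bipolar random codebook and a simple superposition (bundling) encoding, and the symbol names, dimension, and threshold are illustrative choices. Random hypervectors in high dimension are nearly orthogonal (incoherent), which is what makes decoding by inner product work:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10_000  # HD dimension; larger d means lower coherence between random codes

# Random bipolar codebook: one hypervector per symbol (illustrative symbols).
symbols = ["a", "b", "c", "d"]
codebook = {s: rng.choice([-1, 1], size=d) for s in symbols}

# Encode a set of items by bundling (elementwise sum) their hypervectors.
stored = ["a", "c"]
hd = np.sum([codebook[s] for s in stored], axis=0)

# Decode: the normalized dot product is ~1 for stored symbols and ~0 for
# the rest, because distinct random codes are nearly orthogonal.
scores = {s: codebook[s] @ hd / d for s in symbols}
recovered = sorted(s for s, v in scores.items() if v > 0.5)
print(recovered)  # the stored items "a" and "c"
```

The choice of d governs how many items can be bundled before cross-talk between codes swamps the signal, which is the dimension-selection question the framework addresses formally.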
Finally, recent years have seen significant interest in using HDC as a substrate for learning tasks, in particular classification. I develop a formal model for learning with HDC using techniques from statistical learning theory and kernel methods, an enormously successful approach to learning that likewise relies on special properties of high-dimensional embeddings of data. This work clarifies the situations in which learning from HD representations will succeed and elucidates the connections between HDC and related areas of classical machine learning. In contrast to prior work, which has also considered some of the questions above, the analyses developed in this dissertation do not require asymptotic approximations and readily yield formal results for the finite-dimensional encodings one is limited to in practice.
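One classical instance of the connection between high-dimensional embeddings and kernel methods is the random Fourier feature construction, in which inner products of finite-dimensional random encodings approximate a Gaussian (RBF) kernel. This sketch is offered only as an illustration of that general connection — it is not the dissertation's specific model, and the feature count and bandwidth are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n_features, dim = 4096, 5  # encoding dimension and input dimension (illustrative)

# Random Fourier features: z(x)·z(y) ≈ exp(-||x - y||² / 2), the RBF
# kernel with unit bandwidth, with error shrinking as n_features grows.
W = rng.standard_normal((n_features, dim))
b = rng.uniform(0.0, 2.0 * np.pi, n_features)

def encode(x):
    # Finite-dimensional random encoding of the input vector x.
    return np.sqrt(2.0 / n_features) * np.cos(W @ x + b)

x, y = rng.standard_normal(dim), rng.standard_normal(dim)
approx = encode(x) @ encode(y)                  # inner product of encodings
exact = np.exp(-np.sum((x - y) ** 2) / 2.0)     # exact RBF kernel value
```

A linear classifier trained on such encodings behaves like a kernel machine, which is the sense in which results about learning from HD representations can be stated non-asymptotically, at a fixed encoding dimension.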


This item is under embargo until September 21, 2024.