## Geometric Bayes

- Author(s): Holbrook, Andrew
- Advisor(s): Shahbaba, Babak

## Abstract

This dissertation is an investigation into the intersections between differential geometry and Bayesian analysis. The former is the mathematical discipline that underlies our understanding of the spatial structure of the universe; the latter is the unified framework for statistical inference built upon the language of probability and the elegant Bayes' theorem. Here, the two disciplines are combined with the hope that a synergy might emerge and facilitate the useful application of Bayesian inference to real-world science. In particular, dynamic and high-dimensional neural data provides a challenging litmus test for the methods developed herein.

A major component of this work is the development and application of probabilistic models defined over smooth manifolds: dependencies between time series are modeled using the manifold of Hermitian positive definite matrices; probability density functions are modeled using the infinite sphere; and high-dimensional data are modeled using the Stiefel manifold of orthonormal matrices. Whereas formulating a manifold-based model is not difficult---in a certain sense, the geometry occurs a priori in each of the cases considered---the non-trivial geometry presents computational challenges for model-based inference. Hence, this thesis contributes two new algorithms for Bayesian inference on Riemannian manifolds. The first is an algorithm for inference over general Riemannian manifolds and is applied to inference on Hermitian positive definite matrices. The second is an algorithm for inference over manifolds that are embedded in Euclidean space and is applied to inference on the sphere and Stiefel manifolds.
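The second algorithm exploits the fact that, for manifolds embedded in Euclidean space such as the sphere, geodesics are available in closed form and can replace the numerical integrator's straight-line steps. As a minimal illustration (the function names below are my own, not taken from the thesis), the great-circle geodesic on the unit sphere moves a position and velocity forward while staying exactly on the manifold:

```python
import numpy as np

def sphere_geodesic(x, v, t):
    """Follow the great-circle geodesic on the unit sphere starting at
    position x with tangent velocity v for time t.

    Returns the new position and the parallel-transported velocity;
    both the unit norm of x and the norm of v are preserved exactly.
    """
    speed = np.linalg.norm(v)
    if speed == 0.0:
        return x.copy(), v.copy()
    u = v / speed
    x_t = x * np.cos(speed * t) + u * np.sin(speed * t)
    v_t = speed * (-x * np.sin(speed * t) + u * np.cos(speed * t))
    return x_t, v_t

def project_tangent(x, p):
    """Project an ambient vector p onto the tangent space of the sphere
    at x, i.e. remove the component along x."""
    return p - x * (x @ p)
```

In a geodesic sampler of this kind, a sampled momentum is first projected onto the tangent space with `project_tangent`, and position updates use `sphere_geodesic` in place of Euclidean steps, so no reprojection back onto the manifold is ever needed.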

This dissertation is organized as follows. In Chapter 1, the general setting is introduced along with the rudiments of Riemannian geometry. In Chapter 2, the geodesic Lagrangian Monte Carlo algorithm is presented and used for Bayesian inference over the space of Hermitian positive definite matrices to learn the spectral densities of multivariate time series arising from local field potentials in a rodent brain. In Chapter 4, an alternative, conceptually simpler version of the geodesic Monte Carlo is developed, but the new algorithm requires differentiating the pseudo determinant, the derivative of which is derived in Chapter 3. In Chapter 5, the geometry of the infinite-dimensional sphere is leveraged for Bayesian nonparametric density estimation. In Chapter 6, high-dimensional spike trains and local field potentials in a rodent brain are used to predict environmental stimuli. This Bayesian 'neural decoding' is facilitated by both geometric and non-geometric models. Chapter 7 charts the frontiers of Bayesian inference on infinite manifolds.
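For orientation, the pseudo-determinant derivative mentioned above takes a form analogous to Jacobi's formula for the ordinary determinant; under the assumption that $A(t)$ has constant rank, the literature on this problem gives

$$
\frac{\mathrm{d}}{\mathrm{d}t}\,\operatorname{pdet} A(t)
\;=\;
\operatorname{pdet} A(t)\,\operatorname{tr}\!\left(A(t)^{+}\,\frac{\mathrm{d}A(t)}{\mathrm{d}t}\right),
$$

where $A^{+}$ denotes the Moore-Penrose pseudoinverse. This is a sketch of the result's shape, not its precise statement; Chapter 3 gives the exact conditions and derivation.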