UC Irvine Electronic Theses and Dissertations
Improving Statistical Inference through Flexible Approximations

Creative Commons Attribution 4.0 (CC BY 4.0) license
Abstract

In the statistics and machine learning communities, there exists a perceived dichotomy between statistical inference and out-of-sample prediction. Statistical inference is often done with models that are carefully specified a priori, while out-of-sample prediction is often done with “black-box” models that have greater flexibility. The former is more concerned with models' theoretical properties as data become infinite; the latter focuses more on algorithms that scale to larger data sets. To a scientist outside these communities, the distinction between inference and prediction might not seem so clear. With technological advancements, scientists can now collect overwhelming amounts of data in various formats, and their objective is to make sense of the data. To this end, we propose a synergy between statistical inference and the prediction workhorses, neural networks and Gaussian processes. Despite hardware improvements under Moore's law, ever-bigger data and more complex models pose computational challenges for statistical inference. To address these challenges, we approximate functional forms of the data to effectively reduce the burden of model evaluation. In addition, we present a case study in which we use flexible models to learn scientifically interesting representations of rat memories from experimental data, for a better understanding of the brain.
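
The abstract's computational idea, replacing repeated costly model evaluations with a flexible functional approximation, can be made concrete with a short sketch. The Python snippet below is a hypothetical illustration, not the dissertation's actual method: it fits a Gaussian-process surrogate to a toy log-likelihood, after which downstream inference can query the cheap surrogate instead of the expensive model. All function names and the toy likelihood are invented here for illustration.

# A minimal sketch (assumed setup, not the dissertation's method): fit a
# Gaussian-process surrogate to a costly log-likelihood so that inference
# queries the cheap surrogate instead of the expensive model.
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two 1-D input arrays."""
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def expensive_loglik(theta):
    """Stand-in for a costly model evaluation (a toy quadratic here)."""
    return -0.5 * (theta - 2.0) ** 2

# Evaluate the expensive model only at a small design of points...
theta_train = np.linspace(-3.0, 7.0, 12)
y_train = expensive_loglik(theta_train)

# ...then condition the GP on those evaluations (exact GP regression).
jitter = 1e-6  # small diagonal term for numerical stability
K = rbf_kernel(theta_train, theta_train) + jitter * np.eye(theta_train.size)
alpha = np.linalg.solve(K, y_train)

def surrogate_loglik(theta_new):
    """GP posterior mean: a cheap approximation to expensive_loglik."""
    k_star = rbf_kernel(np.atleast_1d(theta_new), theta_train)
    return (k_star @ alpha).item()

# A sampler (e.g., MCMC) could now call surrogate_loglik thousands of
# times at negligible cost per call.
print(surrogate_loglik(1.5), expensive_loglik(1.5))

The design point is that the expensive model is evaluated only at the small training design; every subsequent query costs a kernel evaluation and a dot product rather than a full model run.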
