eScholarship
Open Access Publications from the University of California

Estimating Predictive Rate-Distortion Curves via Neural Variational Inference

  • Author(s): Hahn, Michael; Futrell, Richard

Published Web Location

https://doi.org/10.3390/e21070640
Abstract

The Predictive Rate-Distortion curve quantifies the trade-off between compressing information about the past of a stochastic process and predicting its future accurately. Existing estimation methods for this curve work by clustering finite sequences of observations or by utilizing analytically known causal states. Neither type of approach scales to processes such as natural languages, which have large alphabets and long dependencies, and where the causal states are not known analytically. We describe Neural Predictive Rate-Distortion (NPRD), an estimation method that scales to such processes, leveraging the universal approximation capabilities of neural networks. Taking only time series data as input, the method computes a variational bound on the Predictive Rate-Distortion curve. We validate the method on processes where Predictive Rate-Distortion is analytically known. As an application, we provide bounds on the Predictive Rate-Distortion of natural language, improving on bounds provided by clustering sequences. Based on the results, we argue that the Predictive Rate-Distortion curve is more useful than the usual notion of statistical complexity for characterizing highly complex processes such as natural language.
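To make the trade-off concrete, the sketch below computes points on a predictive rate-distortion curve for a toy process where everything is small enough to enumerate, using classical information-bottleneck self-consistent equations (Blahut-Arimoto-style iteration) rather than the paper's neural variational method. The two-state Markov chain, the variable names, and the choice of trade-off parameters are illustrative assumptions, not details from the paper; NPRD's contribution is precisely that it avoids this kind of explicit enumeration, which does not scale to large alphabets such as natural language.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic process (our illustrative stand-in, not the paper's data):
# a 2-state Markov chain. "Past" X is the current symbol, "future" Y the next.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])               # transition probabilities p(y|x)
# Stationary distribution pi solves pi P = pi (left eigenvector, eigenvalue 1).
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()
p_xy = pi[:, None] * P                   # joint distribution p(x, y)

def mutual_info(p_joint):
    """Mutual information (in nats) of a joint distribution given as a 2-D array."""
    px = p_joint.sum(1, keepdims=True)
    py = p_joint.sum(0, keepdims=True)
    mask = p_joint > 0
    return float((p_joint[mask] * np.log(p_joint[mask] / (px @ py)[mask])).sum())

def prd_point(p_xy, beta, n_z=2, iters=500):
    """One point on the predictive rate-distortion curve via information-
    bottleneck iterations: a soft code Z of the past X trades off the rate
    I(Z;X) against the predictive information I(Z;Y) it retains."""
    px = p_xy.sum(1)
    py_x = p_xy / px[:, None]            # predictive distribution p(y|x)
    q_zx = rng.random((n_z, px.size))    # random soft encoder q(z|x)
    q_zx /= q_zx.sum(0, keepdims=True)
    for _ in range(iters):
        q_z = q_zx @ px                  # code marginal q(z)
        # Decoder q(y|z) = sum_x p(y|x) q(x|z)
        q_y_z = (q_zx * px[None, :]) @ py_x / np.maximum(q_z[:, None], 1e-12)
        # KL(p(y|x) || q(y|z)) for every (z, x) pair
        kl = (py_x[None, :, :] * (np.log(py_x[None, :, :] + 1e-12)
                                  - np.log(q_y_z[:, None, :] + 1e-12))).sum(-1)
        q_zx = q_z[:, None] * np.exp(-beta * kl)   # self-consistent encoder update
        q_zx /= q_zx.sum(0, keepdims=True)
    rate = mutual_info((q_zx * px[None, :]).T)     # I(Z;X): bits kept about the past
    pred = mutual_info((q_zx * px[None, :]) @ py_x)  # I(Z;Y): predictive info retained
    return rate, pred

# Sweeping beta traces the curve: small beta compresses the past aggressively
# (low rate, little prediction); large beta keeps nearly all predictive information.
for beta in (0.1, 2.0, 50.0):
    rate, pred = prd_point(p_xy, beta)
    print(f"beta={beta:5.1f}  rate={rate:.4f}  predictive info={pred:.4f}")
```

Each iteration alternates the three self-consistent updates (code marginal, decoder, encoder) until the encoder stabilizes. NPRD replaces these explicit tables with neural networks trained on raw time series, which is what lets the same variational bound be computed for processes whose causal states are unknown and whose alphabets are large.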

