Ambiguity Rate of Hidden Markov Processes

Abstract

The $\epsilon$-machine is a stochastic process's optimal model -- maximally predictive and minimal in size. It often happens that, to optimally predict even simply defined processes, probabilistic models -- including the $\epsilon$-machine -- must employ an uncountably infinite set of features. To work constructively with these infinite sets, we map the $\epsilon$-machine to a place-dependent iterated function system (IFS) -- a stochastic dynamical system. We then introduce the ambiguity rate that, in conjunction with a process's Shannon entropy rate, determines the rate at which this set of predictive features must grow to maintain maximal predictive power. We demonstrate, as an ancillary technical result that stands on its own, that the ambiguity rate is the previously missing correction to the Lyapunov dimension of an IFS's attractor. For a broad class of complex processes, this then allows calculating, for the first time, their statistical complexity dimension -- the information dimension of the minimal set of predictive features.
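To make the IFS mapping concrete, the sketch below simulates the mixed-state (belief-state) dynamics of a small hidden Markov model: each symbol x is emitted with probability p(x | eta) = eta T^(x) 1, and the mixed state eta updates to the normalized vector eta T^(x). It then estimates the Shannon entropy rate by averaging -log2 p(x | eta) along a trajectory. The transition matrices and parameter values here are illustrative assumptions, not taken from the paper, and computing the ambiguity rate or Lyapunov dimension requires the paper's additional machinery.

```python
import numpy as np

# Hypothetical 2-state, 2-symbol HMM (illustrative only, not from the paper).
# T[x][i, j] is the probability of moving from hidden state i to j while emitting x.
# The symbol-summed matrix sum_x T[x] is row-stochastic.
T = {
    0: np.array([[0.4, 0.1],
                 [0.2, 0.1]]),
    1: np.array([[0.0, 0.5],
                 [0.3, 0.4]]),
}

rng = np.random.default_rng(0)

def mixed_state_step(eta, rng):
    """One step of the place-dependent IFS over mixed states.

    Emits symbol x with probability p(x | eta) = eta @ T[x] @ 1 and maps
    eta to the normalized vector eta @ T[x].
    """
    ones = np.ones(len(eta))
    probs = {x: float(eta @ Tx @ ones) for x, Tx in T.items()}
    symbols = list(probs)
    x = rng.choice(symbols, p=[probs[s] for s in symbols])
    new_eta = eta @ T[x]
    return x, probs[x], new_eta / new_eta.sum()

# Estimate the Shannon entropy rate h_mu as the average of -log2 p(x | eta)
# over a long mixed-state trajectory (after a burn-in period).
eta = np.array([0.5, 0.5])
burn_in, steps = 10_000, 100_000
log_probs = []
for t in range(burn_in + steps):
    x, p, eta = mixed_state_step(eta, rng)
    if t >= burn_in:
        log_probs.append(np.log2(p))

print(f"estimated entropy rate h_mu ~ {-np.mean(log_probs):.4f} bits/symbol")
```

The same trajectory of mixed states traces out the attractor of the IFS; in the paper's framework, the dimension of that attractor -- and hence the statistical complexity dimension -- is obtained only after the ambiguity-rate correction discussed in the abstract.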
