Universal Approximation Depth and Errors of Narrow Belief Networks with Discrete Units

Published Web Location

https://arxiv.org/abs/1303.7461

No data is associated with this publication.

Abstract

We generalize recent theoretical work on the minimal number of layers of narrow deep belief networks that can approximate any probability distribution on the states of their visible units arbitrarily well. We relax the setting of binary units (Sutskever & Hinton, 2008; Le Roux & Bengio, 2008, 2010; Montúfar & Ay, 2011) to units with arbitrary finite state spaces, and the vanishing approximation error to an arbitrary approximation error tolerance. For example, we show that a q-ary deep belief network with L ≥ 2 + (q^⌈m⌉_δ − 1)/(q − 1) layers of width n ≤ m + log_q(m) + 1 for some m ∈ ℕ can approximate any probability distribution on {0, 1, …, q−1}^n without exceeding a Kullback-Leibler divergence of δ. Our analysis covers discrete restricted Boltzmann machines and naive Bayes models as special cases.
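To make the scaling of these bounds concrete, the following minimal Python sketch (not part of the publication) evaluates the layer and width bounds quoted in the abstract for given q and m. The exponent ⌈m⌉_δ depends on the error tolerance δ in a way specified in the paper, so the sketch simply takes it as an explicit argument; the function name `dbn_bounds` and its signature are hypothetical.

```python
import math

def dbn_bounds(q: int, m: int, m_ceil_delta: int):
    """Evaluate the depth and width bounds quoted in the abstract.

    q            -- state-space size of each unit (q = 2 recovers binary units)
    m            -- the integer parameter m from the abstract
    m_ceil_delta -- the exponent written ⌈m⌉_δ, whose dependence on the
                    tolerance δ is defined in the paper (assumed given here)
    """
    # Layers: L >= 2 + (q^⌈m⌉_δ - 1) / (q - 1); the division is exact
    # because q^k - 1 is a multiple of q - 1 (geometric series).
    min_layers = 2 + (q ** m_ceil_delta - 1) // (q - 1)
    # Width: n <= m + log_q(m) + 1.
    width_bound = m + math.log(m, q) + 1
    return min_layers, width_bound

# Example: binary units (q = 2) with m = 5, taking the exponent equal to m
# purely for illustration (i.e., ignoring the δ-dependence).
print(dbn_bounds(q=2, m=5, m_ceil_delta=5))  # -> (33, ~8.32)
```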
