eScholarship
Open Access Publications from the University of California

UC San Diego Electronic Theses and Dissertations

Variational and scale mixture representations of non-Gaussian densities for estimation in the Bayesian Linear Model: sparse coding, independent component analysis, and minimum entropy segmentation

Abstract

This thesis considers representations of non-Gaussian probability densities for use in various estimation problems associated with the Bayesian Linear Model. We define a class of densities that we call Strongly Super-Gaussian, and show the relationship of these densities to Gaussian Scale Mixtures and to densities with positive kurtosis. Such densities have been used to model "sparse" random variables, whose densities are sharply peaked with heavy tails. We show that strongly super-Gaussian densities are natural generalizations of Gaussian densities, and permit the derivation of monotonic iterative algorithms for parameter estimation in sparse coding in overcomplete signal dictionaries, blind source separation, independent component analysis, and blind multichannel deconvolution. Mixtures of strongly super-Gaussian densities can be used to model arbitrary densities with greater economy than a Gaussian mixture model. The framework is extended to multivariate dependency models for independent subspace analysis. We apply the methods to the estimation of neural electromagnetic sources from electro-encephalogram recordings, and to sparse coding of images.
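The connection between Gaussian scale mixtures, heavy tails, and positive kurtosis mentioned in the abstract can be illustrated numerically. A minimal sketch (not from the thesis itself): mixing a zero-mean Gaussian over an exponentially distributed variance yields a Laplace density, a classic sharply peaked, heavy-tailed super-Gaussian example, and its sample excess kurtosis comes out positive (about 3 for the Laplace), in contrast to 0 for a pure Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
b = 1.0  # Laplace scale parameter (illustrative choice)

# Gaussian scale mixture: X = sqrt(V) * Z with Z ~ N(0, 1) and
# V ~ Exponential with mean 2*b**2 gives X ~ Laplace(0, b).
v = rng.exponential(scale=2 * b**2, size=n)
z = rng.standard_normal(n)
x = np.sqrt(v) * z

def excess_kurtosis(s):
    """Sample excess kurtosis: E[(s - mean)^4] / Var(s)^2 - 3."""
    s = s - s.mean()
    return (s**4).mean() / (s**2).mean() ** 2 - 3.0

k = excess_kurtosis(x)  # ~3 for the Laplace density, 0 for a Gaussian
print(round(k, 1))
```

The positive excess kurtosis is one way of quantifying the "sharply peaked with heavy tails" shape the abstract describes; the thesis's Strongly Super-Gaussian class makes this precise through a different, variational characterization.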
