Learning Topic Models and Latent Bayesian Networks Under Expansion Constraints
eScholarship
Open Access Publications from the University of California


Authors: A. Anandkumar, D. Hsu, A. Javanmard, S. M. Kakade, et al.
Abstract

Unsupervised estimation of latent variable models is a fundamental problem central to numerous applications of machine learning and statistics. This work presents a principled approach for estimating broad classes of such models, including probabilistic topic models and latent linear Bayesian networks, using only second-order observed moments. The sufficient conditions for identifiability are primarily weak expansion constraints on the topic-word matrix (for topic models) and on the directed acyclic graph (for Bayesian networks). Because no assumptions are made on the distribution of the latent variables, the approach can handle arbitrary correlations among the topics or latent factors. In addition, a tractable learning method based on $\ell_1$ optimization is proposed and evaluated in numerical experiments.
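The $\ell_1$-based learning step can be illustrated in miniature by basis pursuit: recovering a sparse vector from a small number of linear measurements by minimizing its $\ell_1$ norm. This is a generic sketch of the optimization primitive, not the authors' exact algorithm; all sizes and variable names below are illustrative assumptions.

```python
# Generic basis-pursuit sketch (min ||x||_1 s.t. Ax = b), cast as a linear
# program. Illustrative only -- not the paper's full estimation procedure.
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 subject to Ax = b as a linear program.

    Substitute x = u - v with u, v >= 0, so ||x||_1 = sum(u) + sum(v).
    """
    m, n = A.shape
    c = np.ones(2 * n)                 # objective: sum(u) + sum(v)
    A_eq = np.hstack([A, -A])          # enforce A(u - v) = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    z = res.x
    return z[:n] - z[n:]

# Illustrative instance: a 3-sparse vector and 25 Gaussian measurements.
rng = np.random.default_rng(0)
m, n, k = 25, 50, 3
A = rng.standard_normal((m, n))
x0 = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x0[support] = rng.standard_normal(k)
b = A @ x0

x_hat = basis_pursuit(A, b)
```

With enough random measurements relative to the sparsity level, the $\ell_1$ minimizer coincides with the sparse ground truth, which is the kind of guarantee the expansion conditions in the paper are designed to deliver for structured (rather than random) matrices.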

