Approximate Inference in Graphical Models
- Forouzan, Sholeh
- Advisor(s): Ihler, Alexander
Abstract
Graphical models have become a central paradigm for knowledge representation and reasoning over models with large numbers of variables. Any useful application of these models involves inference, or reasoning about the state of the underlying variables and quantifying the models' uncertainty about any assignment to them. Unfortunately, exact inference in graphical models is fundamentally intractable, which has led to significant interest in approximate inference algorithms.
In this thesis we address several aspects of approximate inference that affect its quality. First, combining ideas from variational inference and message passing on graphical models, we study how the regions over which the approximation is formed can be selected more effectively, using a content-based scoring function that computes a local measure of the improvement to the upper bound on the log partition function. We then extend this framework to use the available memory more efficiently and show that this leads to better approximations; we propose several memory allocation strategies and empirically demonstrate how they improve the quality of the upper bound. Finally, we address the optimization algorithms used in approximate inference. Focusing on maximum a posteriori (MAP) inference and its linear programming (LP) relaxation, we show how the Alternating Direction Method of Multipliers (ADMM) provides an elegant algorithm for finding the saddle point of the augmented Lagrangian of the approximation, and we present an ADMM-based algorithm that solves the primal form of the MAP-LP and whose closed-form updates are based on a linear approximation technique.
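As background for the ADMM-based approach mentioned above, the sketch below shows the generic ADMM updates for an augmented Lagrangian. The symbols (f, g, A, B, c, rho) are illustrative placeholders for a standard two-block splitting, not the thesis's own MAP-LP formulation or notation, whose specific splitting and closed-form updates are given in the full text.

\[
\min_{x,z}\; f(x) + g(z) \quad \text{s.t.}\quad Ax + Bz = c,
\qquad
L_\rho(x,z,y) = f(x) + g(z) + y^\top (Ax + Bz - c) + \tfrac{\rho}{2}\,\|Ax + Bz - c\|_2^2 .
\]
\[
x^{k+1} = \arg\min_x\; L_\rho(x, z^k, y^k), \qquad
z^{k+1} = \arg\min_z\; L_\rho(x^{k+1}, z, y^k), \qquad
y^{k+1} = y^k + \rho\,(Ax^{k+1} + Bz^{k+1} - c).
\]

Each iteration alternates a minimization over each primal block with a gradient ascent step on the dual variable y, driving the iterates toward a saddle point of the augmented Lagrangian.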