We present an integrated approach for structure and parameter estimation in
latent tree graphical models. Our overall approach follows a
"divide-and-conquer" strategy that learns models over small groups of variables
and iteratively merges them into a global solution. Structure learning involves
combinatorial operations such as minimum spanning tree construction and local
recursive grouping; parameter learning is based on the method of moments
and on tensor decompositions. Our method is guaranteed to correctly recover the
unknown tree structure and the model parameters with low sample complexity for
the class of linear multivariate latent tree models, which includes discrete and
Gaussian distributions as well as Gaussian mixtures. Our algorithm has a bulk
asynchronous parallel implementation, and its parallel computational complexity
increases only logarithmically with the number of variables and linearly with
the dimensionality of each variable.
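
As a minimal illustration of the divide step on the structure-learning side, the sketch below estimates pairwise information distances from Gaussian samples (as negative log absolute correlations, a standard choice for linear/Gaussian latent trees) and builds a minimum spanning tree over the observed variables with SciPy. The variable names and the toy chain data are assumptions for illustration, not the paper's exact procedure.

```python
# Illustrative sketch (not the paper's exact procedure): build a minimum
# spanning tree over observed variables using information distances
# d(i, j) = -log |rho_ij|.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def information_distances(samples):
    """samples: (n_samples, n_variables) array of observations."""
    corr = np.corrcoef(samples, rowvar=False)   # pairwise correlations
    corr = np.clip(np.abs(corr), 1e-12, 1.0)    # guard against log(0)
    dist = -np.log(corr)                        # information distances
    np.fill_diagonal(dist, 0.0)
    return dist

def mst_edges(dist):
    """Edge list of a minimum spanning tree over the distance matrix."""
    tree = minimum_spanning_tree(dist).tocoo()
    return list(zip(tree.row, tree.col))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: a chain x0 -> x1 -> x2 -> x3 of noisy linear relations.
    x0 = rng.normal(size=5000)
    x1 = 0.8 * x0 + 0.3 * rng.normal(size=5000)
    x2 = 0.8 * x1 + 0.3 * rng.normal(size=5000)
    x3 = 0.8 * x2 + 0.3 * rng.normal(size=5000)
    samples = np.column_stack([x0, x1, x2, x3])
    print(mst_edges(information_distances(samples)))  # expect the chain edges
```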
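On the parameter-learning side, method-of-moments estimators start from empirical low-order moments of the observed variables. A hedged sketch of forming second- and third-order cross moments of three observed views is given below; the three-view setup and the names M2, M3 are assumptions for illustration only, not the paper's estimator.

```python
# Illustrative sketch: empirical second- and third-order cross moments of
# three observed views, the raw ingredients for tensor-decomposition-based
# parameter estimation (the three-view setup here is an assumption).
import numpy as np

def cross_moments(xa, xb, xc):
    """xa, xb, xc: (n_samples, d) arrays of co-occurring observations."""
    n = xa.shape[0]
    m2 = np.einsum("ni,nj->ij", xa, xb) / n          # d x d second-order moment
    m3 = np.einsum("ni,nj,nk->ijk", xa, xb, xc) / n  # d x d x d third-order moment
    return m2, m3

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy latent-variable data: a hidden h drives three noisy linear views.
    h = rng.normal(size=(10000, 1))
    A, B, C = (rng.normal(size=(1, 3)) for _ in range(3))
    xa = h @ A + 0.1 * rng.normal(size=(10000, 3))
    xb = h @ B + 0.1 * rng.normal(size=(10000, 3))
    xc = h @ C + 0.1 * rng.normal(size=(10000, 3))
    m2, m3 = cross_moments(xa, xb, xc)
    print(m2.shape, m3.shape)  # (3, 3) (3, 3, 3)
```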