Published July 2019 | Journal Article (Open Access)
Available versions: Submitted, Published, Supplemental Material

Guaranteed Scalable Learning of Latent Tree Models

Abstract

We present an integrated approach to structure and parameter estimation in latent tree graphical models, where some nodes are hidden. Our overall approach follows a "divide-and-conquer" strategy that learns models over small groups of variables and iteratively merges them into a global solution. The structure learning involves combinatorial operations such as minimum spanning tree construction and local recursive grouping; the parameter learning is based on the method of moments and on tensor decompositions. Our method is guaranteed to correctly recover the unknown tree structure and the model parameters with low sample complexity for the class of linear multivariate latent tree models, which includes discrete and Gaussian distributions as well as Gaussian mixtures. Our bulk asynchronous parallel algorithm is implemented in parallel and scales logarithmically with the number of variables and linearly with the dimensionality of each variable.
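The minimum-spanning-tree step mentioned in the abstract can be illustrated with a small, self-contained sketch. The distances below are hypothetical placeholder values, not estimates from data; the paper's method would compute information distances between observable variables before this step, and `minimum_spanning_tree` here is just a textbook Prim's algorithm, not the authors' implementation.

```python
def minimum_spanning_tree(dist):
    """Prim's algorithm over a symmetric distance matrix.

    Returns a list of (parent, child) edges forming an MST,
    starting the tree from variable 0.
    """
    n = len(dist)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        # Pick the cheapest edge crossing the cut (tree, non-tree).
        u, v = min(
            ((i, j) for i in in_tree for j in range(n) if j not in in_tree),
            key=lambda e: dist[e[0]][e[1]],
        )
        edges.append((u, v))
        in_tree.add(v)
    return edges

# Hypothetical pairwise distances among four observable variables.
d = [
    [0, 1, 4, 5],
    [1, 0, 2, 6],
    [4, 2, 0, 3],
    [5, 6, 3, 0],
]

mst_edges = minimum_spanning_tree(d)  # [(0, 1), (1, 2), (2, 3)]
```

In the paper's pipeline, local recursive grouping would then operate on small neighborhoods of this tree to introduce hidden nodes, and the resulting subtrees would be merged into the global latent tree.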

Additional Information

© 2020 by the author(s). Huang is supported by a startup fund from the Department of Computer Science, University of Maryland, National Science Foundation IIS-1850220 CRII Award 030742-00001, and Adobe, Capital One, and JP Morgan faculty fellowships. Sun is supported by National Science Foundation awards IIS-1418511, CCF-1533768, and IIS-1838042, and National Institutes of Health awards 1R01MD011682-01 and R56HL138415. Anandkumar is supported in part by the Bren endowed chair, DARPA PAI, Raytheon, and Microsoft, Google, and Adobe faculty fellowships.

Attached Files

Published - huang20b.pdf

Submitted - 1406.4566.pdf

Supplemental Material - huang20b-supp.pdf

Files (4.2 MB total)

md5:fdc414b47ba35c76a363d3655f74e363 — 1.7 MB
md5:145360b6d2469d008a89a1a5bb1e0cfa — 2.1 MB
md5:e82e1850e56f07e02caa6d3138942ec6 — 333.4 kB

Additional details

Created: August 19, 2023
Modified: October 23, 2023