Published 2009
Book Section - Chapter (Open Access)

Infinite State Bayesian Networks

Abstract

A general modeling framework is proposed that unifies nonparametric Bayesian models, topic models, and Bayesian networks. This class of infinite state Bayes nets (ISBNs) can be viewed as directed networks of 'hierarchical Dirichlet processes' (HDPs) where the domain of the variables can be structured (e.g. words in documents or features in images). We show that collapsed Gibbs sampling can be done efficiently in these models by leveraging the structure of the Bayes net and using the forward-filtering-backward-sampling algorithm for junction trees. Existing models, such as the nested DP, Pachinko allocation, and mixed membership stochastic block models, as well as a number of new models, are described as ISBNs. Two experiments illustrate these ideas.
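The collapsed Gibbs sampling the abstract refers to integrates out the mixing distributions and resamples each latent assignment from its conditional given the current count statistics. The sketch below is not the paper's ISBN sampler; it is a minimal collapsed Gibbs sampler for a finite topic model (the simplest member of the family the abstract unifies), with illustrative toy data and hyperparameters chosen here for the example.

```python
import random

# Toy corpus: each document is a list of word ids (assumed vocabulary of 4 words).
docs = [[0, 0, 1, 1], [2, 2, 3, 3], [0, 1, 2, 3]]
K, V = 2, 4             # number of topics, vocabulary size (illustrative choices)
alpha, beta = 0.5, 0.5  # symmetric Dirichlet hyperparameters

random.seed(0)
# z[d][i]: topic of word i in document d, initialised at random.
z = [[random.randrange(K) for _ in doc] for doc in docs]

# Count tables maintained incrementally; the topic and word distributions
# themselves are integrated out ("collapsed").
ndk = [[0] * K for _ in docs]      # topic counts per document
nkw = [[0] * V for _ in range(K)]  # word counts per topic
nk = [0] * K                       # total words per topic
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        k = z[d][i]
        ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1

def gibbs_sweep():
    """One pass of collapsed Gibbs: resample each z[d][i] from its conditional."""
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            # Remove the current assignment from the counts.
            ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
            # p(z = j | rest) ∝ (ndk + alpha) * (nkw + beta) / (nk + V*beta)
            weights = [(ndk[d][j] + alpha) * (nkw[j][w] + beta) / (nk[j] + V * beta)
                       for j in range(K)]
            k = random.choices(range(K), weights=weights)[0]
            z[d][i] = k
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1

for _ in range(50):
    gibbs_sweep()
```

The paper's contribution is to generalize this single-variable resampling: rather than updating one assignment at a time, blocks of coupled latent variables in the Bayes net are resampled jointly with forward-filtering-backward-sampling over a junction tree.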

Additional Information

© 2009 Neural Information Processing Systems Foundation. This material is based upon work supported by the National Science Foundation under Grant No. 0447903 and No. 0535278 and by ONR under Grant No. 00014-06-1-0734.

Attached Files

Published - 3310-infinite-state-bayes-nets-for-structured-domains.pdf (234.3 kB)

Additional details

Created:
August 20, 2023
Modified:
January 13, 2024