Infinite State Bayesian Networks
- Creators
- Welling, Max
- Porteous, Ian
- Bart, Evgeniy
Abstract
A general modeling framework is proposed that unifies nonparametric Bayesian models, topic models, and Bayesian networks. This class of infinite state Bayes nets (ISBN) can be viewed as directed networks of 'hierarchical Dirichlet processes' (HDPs) where the domain of the variables can be structured (e.g., words in documents or features in images). We show that collapsed Gibbs sampling can be done efficiently in these models by leveraging the structure of the Bayes net and using the forward-filtering backward-sampling algorithm for junction trees. Existing models, such as the nested DP, Pachinko allocation, and mixed membership stochastic block models, as well as a number of new models, are described as ISBNs. Two experiments are presented to illustrate these ideas.
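The efficiency claim in the abstract rests on the forward-filtering backward-sampling (FFBS) subroutine. As a rough illustration, the sketch below runs FFBS on the simplest case, a chain-structured model with discrete states, rather than the general junction trees used in the paper; all names (`ffbs`, `pi0`, `trans`, `likelihoods`) are hypothetical and not taken from the authors' code.

```python
import numpy as np

def ffbs(pi0, trans, likelihoods, rng=None):
    """Forward-filtering backward-sampling on a chain of discrete states.

    A minimal sketch of the FFBS routine the abstract refers to,
    specialized to a chain rather than a full junction tree.

    pi0:         (K,) initial state distribution
    trans:       (K, K) transitions, trans[i, j] = p(z_t = j | z_{t-1} = i)
    likelihoods: (T, K) observation likelihoods p(x_t | z_t = k)
    """
    rng = np.random.default_rng() if rng is None else rng
    T, K = likelihoods.shape

    # Forward filtering: alpha[t, k] proportional to p(z_t = k | x_{1:t}).
    alpha = np.empty((T, K))
    alpha[0] = pi0 * likelihoods[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ trans) * likelihoods[t]
        alpha[t] /= alpha[t].sum()

    # Backward sampling: draw z_T, then z_{T-1}, ..., z_1 from the
    # conditionals p(z_t | z_{t+1}, x_{1:t}).
    z = np.empty(T, dtype=int)
    z[-1] = rng.choice(K, p=alpha[-1])
    for t in range(T - 2, -1, -1):
        w = alpha[t] * trans[:, z[t + 1]]
        z[t] = rng.choice(K, p=w / w.sum())
    return z
```

Within a collapsed Gibbs sweep, a routine of this kind resamples all states along a chain jointly rather than one variable at a time, which is the source of the efficiency the abstract describes.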
Additional Information
© 2009 Neural Information Processing Systems Foundation. This material is based upon work supported by the National Science Foundation under Grant No. 0447903 and No. 0535278 and by ONR under Grant No. 00014-06-1-0734.
Attached Files
Published - 3310-infinite-state-bayes-nets-for-structured-domains.pdf
Files
Name | Size
---|---
3310-infinite-state-bayes-nets-for-structured-domains.pdf (md5:31e56f142cc04ba59d94a0a7064baeae) | 234.3 kB
Additional details
- Eprint ID
- 65751
- Resolver ID
- CaltechAUTHORS:20160329-153942844
- NSF
- 0447903
- NSF
- 0535278
- Office of Naval Research (ONR)
- 00014-06-1-0734
- Created
- 2016-03-30 (from EPrint's datestamp field)
- Updated
- 2019-10-03 (from EPrint's last_modified field)
- Series Name
- Advances in Neural Information Processing Systems
- Series Volume or Issue Number
- 20