Bayesian Estimation of Latently-grouped Parameters in Undirected Graphical Models.
In large-scale applications of undirected graphical models, such as social networks and biological networks, similar patterns occur frequently and give rise to similar parameters. In this situation, it is beneficial to group the parameters for more efficient learning. We show that even when the grouping is unknown, we can infer these parameter groups during learning via a Bayesian approach. We impose a Dirichlet process prior on the parameters. Posterior inference usually involves calculating intractable terms, so we propose two approximation algorithms, namely a Metropolis-Hastings algorithm with auxiliary variables and a Gibbs sampling algorithm with a "stripped" Beta approximation (Gibbs_SBA). Simulations show that both algorithms outperform conventional maximum likelihood estimation (MLE), and that Gibbs_SBA's performance is close to that of Gibbs sampling with exact likelihood calculation. Models learned with Gibbs_SBA also generalize better than models learned by MLE on real-world Senate voting data.
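As a rough illustration of the latent-grouping idea (this is a sketch, not the authors' code), the snippet below runs a collapsed Gibbs sweep over group assignments under a Chinese restaurant process representation of the Dirichlet process prior. In the paper, the per-group likelihood of an undirected-graphical-model parameter is intractable, which is exactly what the MH-with-auxiliary-variables and Gibbs_SBA algorithms address; here a toy Gaussian surrogate stands in for that term, and `group_log_lik`, `ALPHA`, `OBS_SD`, and `PRIOR_SD` are all assumed names and values introduced only for illustration.

```python
# Minimal sketch (not the authors' implementation): a collapsed Gibbs sweep that
# reassigns parameters to latent groups under a Dirichlet process (CRP) prior.
# The intractable UGM likelihood from the paper is replaced by a toy Gaussian
# surrogate so the example is self-contained and runnable.
import numpy as np

rng = np.random.default_rng(0)
ALPHA = 1.0      # DP concentration parameter (assumed value for illustration)
OBS_SD = 0.3     # noise scale of the toy surrogate likelihood
PRIOR_SD = 1.0   # spread of group-level parameter values under the prior

def group_log_lik(theta_hat, members):
    """Toy surrogate: log-predictive of theta_hat given a group's members,
    with the group mean integrated out under a Gaussian prior. This stands in
    for the intractable UGM likelihood handled by MH / Gibbs_SBA in the paper."""
    n = len(members)
    if n == 0:                                   # predictive under the prior (new group)
        mean, var = 0.0, PRIOR_SD**2 + OBS_SD**2
    else:                                        # posterior predictive of an existing group
        prec = 1.0 / PRIOR_SD**2 + n / OBS_SD**2
        mean = (np.sum(members) / OBS_SD**2) / prec
        var = 1.0 / prec + OBS_SD**2
    return -0.5 * (np.log(2 * np.pi * var) + (theta_hat - mean) ** 2 / var)

def gibbs_sweep(theta_hats, z):
    """One CRP Gibbs sweep over the group assignments z of the parameters."""
    for i, th in enumerate(theta_hats):
        z[i] = -1                                # remove parameter i from its group
        labels = sorted(set(z) - {-1})
        log_probs, options = [], []
        for k in labels:                         # existing group: prob ∝ n_k * likelihood
            members = theta_hats[z == k]
            log_probs.append(np.log(len(members)) + group_log_lik(th, members))
            options.append(k)
        new_k = (max(labels) + 1) if labels else 0
        log_probs.append(np.log(ALPHA) + group_log_lik(th, np.array([])))
        options.append(new_k)                    # new group: prob ∝ alpha * prior predictive
        lp = np.array(log_probs)
        p = np.exp(lp - lp.max())
        z[i] = options[rng.choice(len(options), p=p / p.sum())]
    return z

# Toy data: 12 "parameter estimates" drawn from three latent groups.
theta_hats = np.concatenate([rng.normal(m, OBS_SD, 4) for m in (-1.5, 0.0, 1.5)])
z = np.zeros(len(theta_hats), dtype=int)
for _ in range(50):
    z = gibbs_sweep(theta_hats, z)
print("inferred group assignments:", z)
```

The reassignment probabilities follow the standard CRP form: an existing group is chosen with probability proportional to its size times the likelihood of the parameter under that group, and a new group with probability proportional to the concentration times the prior predictive. Replacing the Gaussian surrogate with an approximation to the UGM likelihood, such as the paper's auxiliary-variable MH step or the stripped Beta approximation, would recover the structure of the proposed samplers.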
Related Subject Headings
- 4611 Machine learning
- 1702 Cognitive Sciences
- 1701 Psychology