Bayesian learning of joint distributions of objects

Conference Paper (Published)

Abstract

There is increasing interest across broad application areas in defining flexible joint models for data having a variety of measurement scales, while also allowing data of complex types, such as functions, images and documents. We consider a general framework for nonparametric Bayes joint modeling through mixture models that incorporate dependence across data types through a joint mixing measure. The mixing measure is assigned a novel infinite tensor factorization (ITF) prior that allows flexible dependence in cluster allocation across data types. The ITF prior is formulated as a tensor product of stick-breaking processes. Focusing on a convenient special case corresponding to a Parafac factorization, we provide basic theory justifying the flexibility of the proposed prior and the resulting asymptotic properties. For ITF mixtures of product kernels, we develop a new Gibbs sampling algorithm for routine implementation, relying on slice sampling. The methods are compared with alternative joint mixture models based on Dirichlet processes and related approaches through simulations and real data applications.

Copyright 2013 by the authors.
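To make the construction concrete, here is a minimal simulation sketch in Python/NumPy of the Parafac special case: a truncated rank-R probability tensor over joint cluster indices is assembled from stick-breaking weights and then used to draw dependent cluster allocations for two data types. The truncation levels (R, H, K), the concentration parameter, and all variable names are illustrative assumptions, not notation from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(alpha, size, rng):
    """Truncated stick-breaking weights: w_h = v_h * prod_{l<h} (1 - v_l)."""
    v = rng.beta(1.0, alpha, size)
    v[-1] = 1.0  # close the stick at the truncation level so weights sum to one
    return v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))

# Illustrative truncation levels and concentration (not from the paper)
R, H, K = 5, 10, 10   # latent classes; clusters for data types 1 and 2
alpha = 1.0

lam  = stick_breaking(alpha, R, rng)                                 # class weights
phi1 = np.stack([stick_breaking(alpha, H, rng) for _ in range(R)])   # type-1 cluster probs per class
phi2 = np.stack([stick_breaking(alpha, K, rng) for _ in range(R)])   # type-2 cluster probs per class

# Rank-R Parafac probability tensor: Pi[h, k] = sum_r lam[r] * phi1[r, h] * phi2[r, k]
Pi = np.einsum('r,rh,rk->hk', lam, phi1, phi2)
assert np.isclose(Pi.sum(), 1.0)

# Draw joint cluster allocations (z1, z2) for n observations
n = 1000
flat = rng.choice(H * K, size=n, p=Pi.ravel())
z1, z2 = np.unravel_index(flat, (H, K))
```

Because lam and each row of phi1 and phi2 are valid probability vectors under truncation, Pi is a proper joint distribution over cluster pairs; the shared latent class index r is what induces dependence between the two data types' allocations, and the rank-1 case (R = 1) would recover independent allocations.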

Cited Authors

  • Banerjee, A; Murray, J; Dunson, DB

Published Date

  • January 1, 2013

Published In

  • Journal of Machine Learning Research

Volume / Issue

  • 31

Start / End Page

  • 1 - 9

Electronic International Standard Serial Number (EISSN)

  • 1533-7928

International Standard Serial Number (ISSN)

  • 1532-4435

Citation Source

  • Scopus