Scalable Bayesian non-negative tensor factorization for massive count data

Published

Conference Paper

We present a Bayesian non-negative tensor factorization model for count-valued tensor data, and develop scalable inference algorithms (both batch and online) for dealing with massive tensors. Our generative model can handle overdispersed counts and can infer the rank of the decomposition. Moreover, leveraging a reparameterization of the Poisson distribution as a multinomial facilitates conjugacy in the model and enables simple and efficient Gibbs sampling and variational Bayes (VB) inference updates, with a computational cost that depends only on the number of nonzeros in the tensor. The model also lends a nice interpretability to the factors; in our model, each factor corresponds to a “topic”. We develop a set of online inference algorithms that allow the model to scale further to massive tensors for which batch inference methods may be infeasible. We apply our framework to diverse real-world applications, such as multiway topic modeling on a scientific publications database, analyzing a political science data set, and analyzing a massive household transactions data set.
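To illustrate the computational idea summarized in the abstract, the sketch below implements a plain Poisson CP factorization of a sparse count tensor with the multinomial augmentation: each nonzero count is allocated across the rank components by a multinomial draw, and the non-negative factors are then resampled from their conjugate gamma posteriors, so each Gibbs sweep touches only the nonzero entries. This is a minimal illustration, not the authors' code: the fixed rank R, the gamma(a0, b0) priors, and the function name gibbs_poisson_cp are assumptions made here, and the paper's model additionally handles overdispersion and infers the rank.

```python
import numpy as np

def gibbs_poisson_cp(indices, counts, shape, R=10, n_iters=100,
                     a0=0.1, b0=0.1, rng=None):
    """Gibbs sampler for a basic Poisson CP model of a sparse count tensor.

    indices: (nnz, K) int array with the coordinates of the nonzero entries.
    counts:  (nnz,)  int array with the corresponding count values.
    shape:   tuple giving the size of each of the K tensor modes.
    """
    rng = np.random.default_rng() if rng is None else rng
    K = len(shape)
    # One non-negative factor matrix per mode, initialized from the gamma prior.
    factors = [rng.gamma(a0, 1.0 / b0, size=(shape[k], R)) for k in range(K)]

    for _ in range(n_iters):
        # 1) Multinomial augmentation: split each nonzero count across the
        #    R components with probabilities proportional to the factor products.
        rates = np.ones((len(counts), R))
        for k in range(K):
            rates *= factors[k][indices[:, k], :]
        probs = rates / rates.sum(axis=1, keepdims=True)
        alloc = np.vstack([rng.multinomial(y, p) for y, p in zip(counts, probs)])

        # 2) Conjugate update: resample each factor matrix from its gamma posterior.
        for k in range(K):
            # Shape parameters: prior plus allocated counts aggregated per (row, component).
            shp = np.full((shape[k], R), a0)
            np.add.at(shp, indices[:, k], alloc)
            # Rate parameters: prior plus the product of the other modes' column sums.
            other = np.ones(R)
            for j in range(K):
                if j != k:
                    other *= factors[j].sum(axis=0)
            factors[k] = rng.gamma(shp, 1.0 / (b0 + other))
    return factors

# Toy usage on a hypothetical 3-way count tensor with five observed nonzeros:
# idx = np.array([[0, 1, 2], [3, 0, 1], [2, 2, 0], [1, 4, 3], [0, 0, 0]])
# y   = np.array([4, 1, 7, 2, 3])
# factors = gibbs_poisson_cp(idx, y, shape=(4, 5, 4), R=3, n_iters=200)
```

Only the observed nonzero coordinates and their counts are supplied, which is what keeps the per-iteration cost proportional to the number of nonzeros rather than to the full tensor size.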

Cited Authors

  • Hu, C; Rai, P; Chen, C; Harding, M; Carin, L

Published Date

  • January 1, 2015

Published In

  • Lecture Notes in Computer Science

Volume / Issue

  • 9285 /

Start / End Page

  • 53 - 70

Electronic International Standard Serial Number (EISSN)

  • 1611-3349

International Standard Serial Number (ISSN)

  • 0302-9743

International Standard Book Number 13 (ISBN-13)

  • 9783319235240

Digital Object Identifier (DOI)

  • 10.1007/978-3-319-23525-7_4

Citation Source

  • Scopus