Tensor decompositions for learning latent variable models (A survey for ALT)

Publication, Conference
Anandkumar, A; Ge, R; Hsu, D; Kakade, SM; Telgarsky, M
Published in: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
January 1, 2015

This note is an abridged version of [1], intended as a survey for the 2015 Algorithmic Learning Theory (ALT) conference. This work considers a computationally and statistically efficient parameter estimation method for a wide class of latent variable models, including Gaussian mixture models, hidden Markov models, and latent Dirichlet allocation, which exploits a certain tensor structure in their low-order observable moments (typically of second and third order). Specifically, parameter estimation is reduced to the problem of extracting a certain (orthogonal) decomposition of a symmetric tensor derived from the moments; this decomposition can be viewed as a natural generalization of the singular value decomposition for matrices. Although tensor decompositions are generally intractable to compute, the decomposition of these specially structured tensors can be efficiently obtained by a variety of approaches, including power iterations and maximization approaches (similar to the case of matrices). A detailed analysis of a robust tensor power method is provided, establishing an analogue of Wedin's perturbation theorem for the singular vectors of matrices. This implies a robust and computationally tractable estimation approach for several popular latent variable models.
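The power-iteration approach described above can be sketched in a few lines of NumPy. The sketch below assumes an exactly orthogonally decomposable symmetric tensor T = Σᵢ λᵢ vᵢ⊗vᵢ⊗vᵢ with orthonormal vᵢ; it illustrates the basic (non-robust) iteration with random restarts and deflation, not the full robust analysis of the paper. The function name and parameters are illustrative, not from the source.

```python
import numpy as np

def tensor_power_method(T, n_components, n_iters=100, n_restarts=10, seed=0):
    """Recover (lambda_i, v_i) from a symmetric tensor
    T = sum_i lambda_i * v_i (outer) v_i (outer) v_i with orthonormal v_i,
    via repeated power iteration v <- T(I, v, v) / ||T(I, v, v)|| and deflation."""
    rng = np.random.default_rng(seed)
    d = T.shape[0]
    eigvals, eigvecs = [], []
    for _ in range(n_components):
        best_v, best_lam = None, -np.inf
        for _ in range(n_restarts):  # random restarts; keep the largest eigenvalue found
            v = rng.standard_normal(d)
            v /= np.linalg.norm(v)
            for _ in range(n_iters):
                # the tensor power map: contract T against v twice
                v = np.einsum('ijk,j,k->i', T, v, v)
                v /= np.linalg.norm(v)
            lam = np.einsum('ijk,i,j,k->', T, v, v, v)  # generalized Rayleigh quotient
            if lam > best_lam:
                best_lam, best_v = lam, v
        eigvals.append(best_lam)
        eigvecs.append(best_v)
        # deflate: subtract the recovered rank-1 component and repeat
        T = T - best_lam * np.einsum('i,j,k->ijk', best_v, best_v, best_v)
    return np.array(eigvals), np.array(eigvecs)
```

In the noiseless orthogonal case each restart converges to one of the vᵢ (up to sign, which the iteration resolves toward positive λᵢ); the paper's robust variant additionally controls how perturbations of T propagate to the recovered eigenpairs.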


Published In

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

DOI

10.1007/978-3-319-24486-0_2

EISSN

1611-3349

ISSN

0302-9743

ISBN

9783319244853

Publication Date

January 1, 2015

Volume

9355

Start / End Page

19 / 38

Related Subject Headings

  • Artificial Intelligence & Image Processing

Citation

APA
Anandkumar, A., Ge, R., Hsu, D., Kakade, S. M., & Telgarsky, M. (2015). Tensor decompositions for learning latent variable models (A survey for ALT). In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9355, pp. 19–38). https://doi.org/10.1007/978-3-319-24486-0_2

Chicago
Anandkumar, A., R. Ge, D. Hsu, S. M. Kakade, and M. Telgarsky. “Tensor decompositions for learning latent variable models (A survey for ALT).” In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 9355:19–38, 2015. https://doi.org/10.1007/978-3-319-24486-0_2.

ICMJE
Anandkumar A, Ge R, Hsu D, Kakade SM, Telgarsky M. Tensor decompositions for learning latent variable models (A survey for ALT). In: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). 2015. p. 19–38.

MLA
Anandkumar, A., et al. “Tensor decompositions for learning latent variable models (A survey for ALT).” Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 9355, 2015, pp. 19–38. Scopus, doi:10.1007/978-3-319-24486-0_2.

NLM
Anandkumar A, Ge R, Hsu D, Kakade SM, Telgarsky M. Tensor decompositions for learning latent variable models (A survey for ALT). Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). 2015. p. 19–38.