
Spectral learning on matrices and tensors

Journal Article
Janzamin, M; Ge, R; Kossaifi, J; Anandkumar, A
Published in: Foundations and Trends in Machine Learning
January 1, 2019

Spectral methods have been a mainstay in several domains such as machine learning, applied mathematics, and scientific computing. They involve computing a spectral decomposition to obtain basis functions that capture important structures or directions for the problem at hand. The most common spectral method is principal component analysis (PCA), which uses the principal components, i.e., the top eigenvectors of the data covariance matrix, to carry out dimensionality reduction as one of its applications. This data pre-processing step is often effective in separating signal from noise. PCA and other spectral techniques applied to matrices have several limitations, however. By restricting themselves to pairwise moments, they effectively make a Gaussian approximation of the underlying data, and hence fail on data with hidden variables, which lead to non-Gaussianity. Yet in almost any data set there are latent effects that cannot be directly observed, e.g., topics in a document corpus, or underlying causes of a disease.

By extending spectral decomposition methods to higher-order moments, we demonstrate the ability to learn a wide range of latent variable models efficiently. Higher-order moments can be represented by tensors, and intuitively they can encode more information than pairwise moment matrices. More crucially, tensor decomposition can pick up latent effects that are missed by matrix methods; for instance, it can uniquely identify non-orthogonal components. Exploiting these aspects turns out to be fruitful for provable unsupervised learning of a wide range of latent variable models.

We also outline the computational techniques needed to design efficient tensor decomposition methods. These methods are embarrassingly parallel and thus scalable to large data sets. While many optimized linear algebra software packages exist, efficient tensor algebra packages are only beginning to be developed. We introduce TensorLy, which has a simple Python interface for expressing tensor operations. It has a flexible back-end system supporting NumPy, PyTorch, TensorFlow, and MXNet, amongst others, which allows it to run on multiple CPUs and GPUs and to integrate seamlessly with deep-learning functionality.
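To make the matrix case concrete, here is a minimal sketch (not code from the monograph) of PCA computed directly from the eigendecomposition of the empirical covariance matrix; the names X and k are illustrative.

import numpy as np

def pca(X, k):
    """Project an (n, d) data matrix X onto its top-k principal components."""
    Xc = X - X.mean(axis=0)                  # center the data
    cov = Xc.T @ Xc / (len(Xc) - 1)          # empirical covariance, shape (d, d)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns eigenvalues in ascending order
    top_k = eigvecs[:, ::-1][:, :k]          # top-k eigenvectors = principal directions
    return Xc @ top_k                        # reduced (n, k) representation

Since only the covariance, a pairwise moment, enters this computation, the projection cannot see structure beyond second order; that is precisely the limitation the tensor methods address.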

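The tensor side can be sketched just as briefly with TensorLy. The example below is a hypothetical illustration rather than the monograph's own code: it forms an empirical third-order moment tensor and factors it with a CP (CANDECOMP/PARAFAC) decomposition. The data and rank are arbitrary, and unpacking parafac's result into weights and factors assumes a recent TensorLy version.

import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

tl.set_backend('numpy')                        # NumPy is the default back-end

# Empirical third-order moment tensor M3 = E[x ⊗ x ⊗ x] of an (n, d) dataset.
X = np.random.rand(500, 10)
M3 = np.einsum('na,nb,nc->abc', X, X, X) / len(X)

# Rank-3 CP decomposition of the (10, 10, 10) moment tensor.
weights, factors = parafac(tl.tensor(M3), rank=3)
print([f.shape for f in factors])              # three (10, 3) factor matrices

# Switching the back-end lets the same calls run on PyTorch tensors
# (and hence on GPUs), assuming PyTorch is installed:
# tl.set_backend('pytorch')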

Published In

Foundations and Trends in Machine Learning

DOI

10.1561/2200000057

EISSN

1935-8245

ISSN

1935-8237

Publication Date

January 1, 2019

Volume

12

Issue

5-6

Start / End Page

393 / 536
 

Citation

APA: Janzamin, M., Ge, R., Kossaifi, J., & Anandkumar, A. (2019). Spectral learning on matrices and tensors. Foundations and Trends in Machine Learning, 12(5–6), 393–536. https://doi.org/10.1561/2200000057
Chicago: Janzamin, M., R. Ge, J. Kossaifi, and A. Anandkumar. “Spectral learning on matrices and tensors.” Foundations and Trends in Machine Learning 12, no. 5–6 (January 1, 2019): 393–536. https://doi.org/10.1561/2200000057.
ICMJE: Janzamin M, Ge R, Kossaifi J, Anandkumar A. Spectral learning on matrices and tensors. Foundations and Trends in Machine Learning. 2019 Jan 1;12(5–6):393–536.
MLA: Janzamin, M., et al. “Spectral learning on matrices and tensors.” Foundations and Trends in Machine Learning, vol. 12, no. 5–6, Jan. 2019, pp. 393–536. Scopus, doi:10.1561/2200000057.
NLM: Janzamin M, Ge R, Kossaifi J, Anandkumar A. Spectral learning on matrices and tensors. Foundations and Trends in Machine Learning. 2019 Jan 1;12(5–6):393–536.